{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# COMPSCI 389: Introduction to Machine Learning\n", "# Classification\n", "\n", "So far we have focussed on **regression** problems in supervised learning. In this notebook, we discuss how the ideas we have discussed carry over to the **classification** setting. Recall that in the classification setting the labels are discrete (nominal or ordinal) values rather than real-valued numbers.\n", "\n", "The conversion from regression to classification is relatively straightforward. We must:\n", "\n", "1. Change the parametric model being used so that it outputs a discrete label as a prediction rather than a real number.\n", "2. Select a loss function that is appropriate for classification tasks.\n", "\n", "**Note**: There are non-parametric ML methods for classification, like decision trees, which are beyond the scope of this course." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Parametric Models for Classification\n", "\n", "If there are $m$ possible values for the label, then parametric models for classification typically have $m$ outputs rather than one output. For an *artificial neural network* (ANN), this means that the output layer has one unit for each possible label.\n", "\n", "Note that we will refer to each possible label as a **class**.\n", "\n", "There are two standard ways of mapping these $m$ outputs to a specific label prediction:\n", "\n", "1. **Deterministic**: The class with the highest output value is chosen as the predicted class. This approach is simple, but it results in the loss function typically having a gradient of zero. This is because, if the output associated with one class is $1$ and the output associated with another class is $1.5$, small changes to the weights may change these values slightly, but not enough to alter the resulting prediction. Hence small changes to the weights do not change the predictions, and hence do not change the loss.\n", "\n", "2. **Stochastic**: The $m$ outputs are used to create a probability distribution over the possible label values, and the predicted label is sampled from this distribution. The most common way of doing this is using the **softmax** function." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Softmax Function\n", "\n", "The **softmax** function doesn't just select the largest of the $m$ values (like \"max\" would), but rather selects the largest most often. However, there are some challenges to convert the $m$ output values, $\\text{out}_1, \\text{out}_2, \\dotsc, \\text{out}_m$ into probabilities for labels $1, 2, \\dotsc, m$.\n", "\n", "First, the values $out_k$ (for $k \\in \\{1,\\dotsc,m\\}$) can be negative, while probabilities cannot be negative. To resolve this, we exponentiate each of the $\\text{out}_k$ values. This ensures that all are at least zero. We now have $m$ values, $e^{\\text{out}_1}, e^{\\text{out}_2}, \\dotsc, e^{\\text{out}_m}$.\n", "\n", "While all of these are positive, they do not necessarily sum to one. For any probability distribution, the sum of the probabilities of all possible outcomes must be one. To fix this, we divide each output by the sum of the outputs. This gives:\n", "\n", "$$\n", "\\frac{e^{\\text{out}_1}}{\\sum_{k=1}^m e^{\\text{out}_k}}, \\frac{e^{\\text{out}_2}}{\\sum_{k=1}^m e^{\\text{out}_k}}, \\dotsc, \\frac{e^{\\text{out}_m}}{\\sum_{k=1}^m e^{\\text{out}_k}}.\n", "$$\n", "\n", "We can now view these as probabilities! 
{ "cell_type": "markdown", "metadata": {}, "source": [ "### Binary Classification\n", "\n", "Note that binary classification is a special case. When there are only two possible labels, we do not need a network (parametric model) with two outputs. Instead, we can use a single output that represents the probability of one of the classes (usually the \"positive\" class). The probability of the other class is one minus the probability of the positive class.\n", "\n", "In this case, we must ensure that the one output of the network can be viewed as a probability. That is, it must be between zero and one. Hence, a sigmoid function like the logistic function is usually applied to the output of the parametric model to squash it to the range $(0,1)$.\n", "\n", "Hence, if the network only has one output, $\\text{out}_1$:\n", "\n", "$$\n", "\\Pr(\\hat Y_i = 1) = \\sigma(\\text{out}_1),\n", "$$\n", "where $\\sigma(z) = \\frac{1}{1+e^{-z}}$, and\n", "$$\n", "\\Pr(\\hat Y_i = 0) = 1 - \\Pr(\\hat Y_i = 1).\n", "$$" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Cross-Entropy Loss (Log Loss)\n", "\n", "The most common loss function used for classification is **cross-entropy loss** (also called **log loss**).\n", "\n", "In both the binary and multiclass classification settings, the Cross-Entropy Loss is:\n", "$$\n", "\\text{Cross-Entropy Loss}(w,D) = -\\frac{1}{n}\\sum_{i=1}^n \\ln \\Big (\\Pr(\\hat Y_i = Y_i)\\Big ).\n", "$$\n", "That is, the negative of the average, over all $n$ points, of the natural logarithm of the probability that the model assigns to the correct label. The inclusion of $\\frac{1}{n}$ does not make a significant difference, since it simply re-scales the loss function (it may or may not be included)." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Logistic Regression\n", "\n", "Logistic regression is a particularly common method for classification. It is the \"linear parametric model\" of binary classification. Specifically, the probability that it outputs a label of $1$ is:\n", "$$\n", "\\Pr(\\hat Y_i=1 | X_i)=\\frac{1}{1+e^{-w\\cdot \\phi(X_i)}}=\\sigma(w \\cdot \\phi(X_i)).\n", "$$\n", "Notice that this is simply the logistic function (sigmoid) applied to a linear parametric model, mapping its output to the range $(0,1)$ so that it can be viewed as a probability (the probability of the positive label). This parametric model for binary classification is called the **logistic model** or **logit model**.\n", "\n", "Logistic regression models are typically trained to maximize the \"likelihood\" of the observed data (the precise definition of likelihood in this context is beyond the scope of this class). It can be shown that this is precisely equivalent to minimizing the Cross-Entropy Loss. **Hence, logistic regression equates to using a logit model and the cross-entropy loss.**" ] },
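{ "cell_type": "markdown", "metadata": {}, "source": [ "As a small illustration (a minimal sketch with made-up features, labels, and weights; not part of the course materials), the following code computes the logit model's probabilities and the cross-entropy loss by hand, and checks the result against PyTorch's built-in binary cross-entropy." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "import torch\n", "\n", "# Made-up feature vectors phi(X_i), binary labels, and weights w\n", "phi = torch.tensor([[1.0, 2.0], [1.0, -1.0], [1.0, 0.5]])\n", "y = torch.tensor([1.0, 0.0, 1.0])\n", "w = torch.tensor([0.3, -0.2])\n", "\n", "# Logit model: Pr(Y_hat = 1 | X_i) = sigma(w . phi(X_i))\n", "p = torch.sigmoid(phi @ w)\n", "\n", "# Cross-entropy loss: -(1/n) * sum of ln(probability assigned to the correct label)\n", "loss = -(y * torch.log(p) + (1 - y) * torch.log(1 - p)).mean()\n", "print(loss)\n", "\n", "# PyTorch's built-in binary cross-entropy computes the same quantity\n", "print(torch.nn.functional.binary_cross_entropy(p, y))" ] },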
] }, { "cell_type": "code", "execution_count": 1, "metadata": {}, "outputs": [], "source": [ "import torch\n", "import torch.nn as nn\n", "import torch.optim as optim\n", "from sklearn.datasets import load_iris\n", "from sklearn.model_selection import train_test_split\n", "import matplotlib.pyplot as plt\n", "\n", "# Load the Iris dataset\n", "iris = load_iris()\n", "X = iris.data\n", "y = iris.target\n", "\n", "# Convert to PyTorch tensors\n", "X_tensor = torch.tensor(X, dtype=torch.float32)\n", "y_tensor = torch.tensor(y, dtype=torch.long) # NOTE: The labels are now integers\n", "\n", "# Train/test split\n", "X_train, X_test, y_train, y_test = train_test_split(X_tensor, y_tensor, test_size=0.5, random_state=42)\n", "\n", "# Define the ANN model\n", "class ANN(nn.Module):\n", " def __init__(self):\n", " super(ANN, self).__init__()\n", " self.fc1 = nn.Linear(4, 10) # 4 input features, 10 hidden nodes\n", " self.fc2 = nn.Linear(10, 3) # 3 output classes NOTE: One output per class\n", "\n", " def forward(self, x):\n", " x = torch.relu(self.fc1(x))\n", " x = self.fc2(x)\n", " return x" ] }, { "cell_type": "code", "execution_count": 2, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Epoch 1/10000, Training Loss: 1.3809908628463745, Test Loss: 1.386336326599121\n", "Epoch 2/10000, Training Loss: 1.3658846616744995, Test Loss: 1.3741258382797241\n", "Epoch 3/10000, Training Loss: 1.3511279821395874, Test Loss: 1.3622355461120605\n", "Epoch 4/10000, Training Loss: 1.3367257118225098, Test Loss: 1.350675344467163\n", "Epoch 5/10000, Training Loss: 1.3226876258850098, Test Loss: 1.3394553661346436\n", "Epoch 6/10000, Training Loss: 1.3090260028839111, Test Loss: 1.3285852670669556\n", "Epoch 7/10000, Training Loss: 1.2957524061203003, Test Loss: 1.3180738687515259\n", "Epoch 8/10000, Training Loss: 1.2828891277313232, Test Loss: 1.30794358253479\n", "Epoch 9/10000, Training Loss: 1.270450472831726, Test Loss: 1.2982019186019897\n", "Epoch 10/10000, Training Loss: 1.2584359645843506, Test Loss: 1.2888445854187012\n", "Epoch 11/10000, Training Loss: 1.2468583583831787, Test Loss: 1.2798775434494019\n", "Epoch 12/10000, Training Loss: 1.2357237339019775, Test Loss: 1.2712684869766235\n", "Epoch 13/10000, Training Loss: 1.225046992301941, Test Loss: 1.2630517482757568\n", "Epoch 14/10000, Training Loss: 1.2148419618606567, Test Loss: 1.2552542686462402\n", "Epoch 15/10000, Training Loss: 1.2050831317901611, Test Loss: 1.2478312253952026\n", "Epoch 16/10000, Training Loss: 1.1957311630249023, Test Loss: 1.240745186805725\n", "Epoch 17/10000, Training Loss: 1.1868141889572144, Test Loss: 1.2339537143707275\n", "Epoch 18/10000, Training Loss: 1.1783475875854492, Test Loss: 1.2274322509765625\n", "Epoch 19/10000, Training Loss: 1.170263409614563, Test Loss: 1.2212125062942505\n", "Epoch 20/10000, Training Loss: 1.1625360250473022, Test Loss: 1.215250015258789\n", "Epoch 21/10000, Training Loss: 1.155208945274353, Test Loss: 1.2095344066619873\n", "Epoch 22/10000, Training Loss: 1.1482043266296387, Test Loss: 1.2040331363677979\n", "Epoch 23/10000, Training Loss: 1.141471028327942, Test Loss: 1.1987659931182861\n", "Epoch 24/10000, Training Loss: 1.1349372863769531, Test Loss: 1.1936908960342407\n", "Epoch 25/10000, Training Loss: 1.1287559270858765, Test Loss: 1.188794493675232\n", "Epoch 26/10000, Training Loss: 1.1229029893875122, Test Loss: 1.1841073036193848\n", "Epoch 27/10000, Training Loss: 1.1173758506774902, Test Loss: 1.1796002388000488\n", "Epoch 28/10000, 
{ "cell_type": "code", "execution_count": 2, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Epoch 1/10000, Training Loss: 1.3809908628463745, Test Loss: 1.386336326599121\n", "Epoch 2/10000, Training Loss: 1.3658846616744995, Test Loss: 1.3741258382797241\n", "Epoch 3/10000, Training Loss: 1.3511279821395874, Test Loss: 1.3622355461120605\n", "Epoch 4/10000, Training Loss: 1.3367257118225098, Test Loss: 1.350675344467163\n", "Epoch 5/10000, Training Loss: 1.3226876258850098, Test Loss: 1.3394553661346436\n", "... (epochs 6-639 omitted: training loss decreases steadily from about 1.31 to 0.304, and test loss from about 1.33 to 0.257) ...\n", "Epoch 640/10000, Training Loss: 
0.3029802143573761, Test Loss: 0.2564481794834137\n", "Epoch 641/10000, Training Loss: 0.3023330569267273, Test Loss: 0.2558271884918213\n", "Epoch 642/10000, Training Loss: 0.3016868531703949, Test Loss: 0.25521448254585266\n", "Epoch 643/10000, Training Loss: 0.30104169249534607, Test Loss: 0.2546093165874481\n", "Epoch 644/10000, Training Loss: 0.30039986968040466, Test Loss: 0.25399795174598694\n", "Epoch 645/10000, Training Loss: 0.2997589409351349, Test Loss: 0.25338077545166016\n", "Epoch 646/10000, Training Loss: 0.29911887645721436, Test Loss: 0.2527589201927185\n", "Epoch 647/10000, Training Loss: 0.298479825258255, Test Loss: 0.25213390588760376\n", "Epoch 648/10000, Training Loss: 0.2978444993495941, Test Loss: 0.2515198886394501\n", "Epoch 649/10000, Training Loss: 0.29721030592918396, Test Loss: 0.25091642141342163\n", "Epoch 650/10000, Training Loss: 0.2965768873691559, Test Loss: 0.250322550535202\n", "Epoch 651/10000, Training Loss: 0.29594433307647705, Test Loss: 0.24973724782466888\n", "Epoch 652/10000, Training Loss: 0.2953139841556549, Test Loss: 0.24914520978927612\n", "Epoch 653/10000, Training Loss: 0.29468634724617004, Test Loss: 0.24854649603366852\n", "Epoch 654/10000, Training Loss: 0.29405951499938965, Test Loss: 0.2479419708251953\n", "Epoch 655/10000, Training Loss: 0.29343369603157043, Test Loss: 0.24733296036720276\n", "Epoch 656/10000, Training Loss: 0.29281046986579895, Test Loss: 0.2467341423034668\n", "Epoch 657/10000, Training Loss: 0.29218894243240356, Test Loss: 0.2461458295583725\n", "Epoch 658/10000, Training Loss: 0.29156818985939026, Test Loss: 0.245567187666893\n", "Epoch 659/10000, Training Loss: 0.29094910621643066, Test Loss: 0.24498362839221954\n", "Epoch 660/10000, Training Loss: 0.2903323471546173, Test Loss: 0.24439559876918793\n", "Epoch 661/10000, Training Loss: 0.2897167205810547, Test Loss: 0.24380375444889069\n", "Epoch 662/10000, Training Loss: 0.28910231590270996, Test Loss: 0.24320925772190094\n", "Epoch 663/10000, Training Loss: 0.28849121928215027, Test Loss: 0.24262665212154388\n", "Epoch 664/10000, Training Loss: 0.28788071870803833, Test Loss: 0.24205507338047028\n", "Epoch 665/10000, Training Loss: 0.2872708737850189, Test Loss: 0.24149398505687714\n", "Epoch 666/10000, Training Loss: 0.2866631746292114, Test Loss: 0.24092811346054077\n", "Epoch 667/10000, Training Loss: 0.2860579192638397, Test Loss: 0.24035504460334778\n", "Epoch 668/10000, Training Loss: 0.2854536473751068, Test Loss: 0.23977579176425934\n", "Epoch 669/10000, Training Loss: 0.2848506271839142, Test Loss: 0.23919253051280975\n", "Epoch 670/10000, Training Loss: 0.284248948097229, Test Loss: 0.23860689997673035\n", "Epoch 671/10000, Training Loss: 0.28365078568458557, Test Loss: 0.2380361706018448\n", "Epoch 672/10000, Training Loss: 0.28305330872535706, Test Loss: 0.23747998476028442\n", "Epoch 673/10000, Training Loss: 0.28245630860328674, Test Loss: 0.23693667352199554\n", "Epoch 674/10000, Training Loss: 0.2818610370159149, Test Loss: 0.23638825118541718\n", "Epoch 675/10000, Training Loss: 0.28126853704452515, Test Loss: 0.23583365976810455\n", "Epoch 676/10000, Training Loss: 0.28067710995674133, Test Loss: 0.23527351021766663\n", "Epoch 677/10000, Training Loss: 0.2800869643688202, Test Loss: 0.2347082495689392\n", "Epoch 678/10000, Training Loss: 0.27949830889701843, Test Loss: 0.23414035141468048\n", "Epoch 679/10000, Training Loss: 0.2789112627506256, Test Loss: 0.23358723521232605\n", "Epoch 680/10000, Training Loss: 0.2783261239528656, Test 
Loss: 0.23304781317710876\n", "Epoch 681/10000, Training Loss: 0.2777429521083832, Test Loss: 0.23250429332256317\n", "Epoch 682/10000, Training Loss: 0.2771611511707306, Test Loss: 0.23195694386959076\n", "Epoch 683/10000, Training Loss: 0.27658072113990784, Test Loss: 0.23140615224838257\n", "Epoch 684/10000, Training Loss: 0.2760016620159149, Test Loss: 0.23085302114486694\n", "Epoch 685/10000, Training Loss: 0.2754240334033966, Test Loss: 0.2302984893321991\n", "Epoch 686/10000, Training Loss: 0.2748490571975708, Test Loss: 0.22975891828536987\n", "Epoch 687/10000, Training Loss: 0.2742747664451599, Test Loss: 0.2292330116033554\n", "Epoch 688/10000, Training Loss: 0.2737019658088684, Test Loss: 0.22870267927646637\n", "Epoch 689/10000, Training Loss: 0.27313122153282166, Test Loss: 0.22816894948482513\n", "Epoch 690/10000, Training Loss: 0.2725619375705719, Test Loss: 0.2276320606470108\n", "Epoch 691/10000, Training Loss: 0.27199408411979675, Test Loss: 0.22709278762340546\n", "Epoch 692/10000, Training Loss: 0.27142763137817383, Test Loss: 0.22655275464057922\n", "Epoch 693/10000, Training Loss: 0.2708626985549927, Test Loss: 0.2260124683380127\n", "Epoch 694/10000, Training Loss: 0.2702997624874115, Test Loss: 0.22548902034759521\n", "Epoch 695/10000, Training Loss: 0.2697378396987915, Test Loss: 0.22496461868286133\n", "Epoch 696/10000, Training Loss: 0.26917779445648193, Test Loss: 0.22443939745426178\n", "Epoch 697/10000, Training Loss: 0.26861923933029175, Test Loss: 0.22393083572387695\n", "Epoch 698/10000, Training Loss: 0.2680625319480896, Test Loss: 0.22341875731945038\n", "Epoch 699/10000, Training Loss: 0.2675074338912964, Test Loss: 0.22290243208408356\n", "Epoch 700/10000, Training Loss: 0.2669537365436554, Test Loss: 0.22238270938396454\n", "Epoch 701/10000, Training Loss: 0.26640135049819946, Test Loss: 0.22186024487018585\n", "Epoch 702/10000, Training Loss: 0.2658505141735077, Test Loss: 0.22133702039718628\n", "Epoch 703/10000, Training Loss: 0.2653011679649353, Test Loss: 0.22081410884857178\n", "Epoch 704/10000, Training Loss: 0.26475340127944946, Test Loss: 0.22029195725917816\n", "Epoch 705/10000, Training Loss: 0.2642076313495636, Test Loss: 0.21978971362113953\n", "Epoch 706/10000, Training Loss: 0.2636626958847046, Test Loss: 0.21928708255290985\n", "Epoch 707/10000, Training Loss: 0.2631196975708008, Test Loss: 0.21878387033939362\n", "Epoch 708/10000, Training Loss: 0.2625781297683716, Test Loss: 0.21828007698059082\n", "Epoch 709/10000, Training Loss: 0.26203808188438416, Test Loss: 0.21777498722076416\n", "Epoch 710/10000, Training Loss: 0.26149967312812805, Test Loss: 0.2172694355249405\n", "Epoch 711/10000, Training Loss: 0.26096266508102417, Test Loss: 0.21676398813724518\n", "Epoch 712/10000, Training Loss: 0.26042723655700684, Test Loss: 0.21626022458076477\n", "Epoch 713/10000, Training Loss: 0.25989338755607605, Test Loss: 0.21577633917331696\n", "Epoch 714/10000, Training Loss: 0.25936099886894226, Test Loss: 0.215291365981102\n", "Epoch 715/10000, Training Loss: 0.25883015990257263, Test Loss: 0.21480487287044525\n", "Epoch 716/10000, Training Loss: 0.25830093026161194, Test Loss: 0.21431602537631989\n", "Epoch 717/10000, Training Loss: 0.25777310132980347, Test Loss: 0.21382519602775574\n", "Epoch 718/10000, Training Loss: 0.257246732711792, Test Loss: 0.21333223581314087\n", "Epoch 719/10000, Training Loss: 0.2567218840122223, Test Loss: 0.21283899247646332\n", "Epoch 720/10000, Training Loss: 0.25619858503341675, Test Loss: 
0.21234667301177979\n", "Epoch 721/10000, Training Loss: 0.2556767165660858, Test Loss: 0.2118554562330246\n", "Epoch 722/10000, Training Loss: 0.25515633821487427, Test Loss: 0.2113664299249649\n", "Epoch 723/10000, Training Loss: 0.2546374797821045, Test Loss: 0.21087969839572906\n", "Epoch 724/10000, Training Loss: 0.2541200816631317, Test Loss: 0.21039558947086334\n", "Epoch 725/10000, Training Loss: 0.2536041736602783, Test Loss: 0.20991487801074982\n", "Epoch 726/10000, Training Loss: 0.25308987498283386, Test Loss: 0.20943672955036163\n", "Epoch 727/10000, Training Loss: 0.2525770664215088, Test Loss: 0.20896129310131073\n", "Epoch 728/10000, Training Loss: 0.2520657181739807, Test Loss: 0.2084881216287613\n", "Epoch 729/10000, Training Loss: 0.25155577063560486, Test Loss: 0.2080170214176178\n", "Epoch 730/10000, Training Loss: 0.2510472238063812, Test Loss: 0.20754802227020264\n", "Epoch 731/10000, Training Loss: 0.2505401372909546, Test Loss: 0.20708192884922028\n", "Epoch 732/10000, Training Loss: 0.2500346899032593, Test Loss: 0.20661891996860504\n", "Epoch 733/10000, Training Loss: 0.2495306432247162, Test Loss: 0.20615853369235992\n", "Epoch 734/10000, Training Loss: 0.2490280121564865, Test Loss: 0.20570023357868195\n", "Epoch 735/10000, Training Loss: 0.24852684140205383, Test Loss: 0.20524351298809052\n", "Epoch 736/10000, Training Loss: 0.24802714586257935, Test Loss: 0.2047865241765976\n", "Epoch 737/10000, Training Loss: 0.24752897024154663, Test Loss: 0.2043289691209793\n", "Epoch 738/10000, Training Loss: 0.24703219532966614, Test Loss: 0.203871950507164\n", "Epoch 739/10000, Training Loss: 0.24653682112693787, Test Loss: 0.20341551303863525\n", "Epoch 740/10000, Training Loss: 0.24604296684265137, Test Loss: 0.2029593139886856\n", "Epoch 741/10000, Training Loss: 0.2455504834651947, Test Loss: 0.20250466465950012\n", "Epoch 742/10000, Training Loss: 0.24505949020385742, Test Loss: 0.20205168426036835\n", "Epoch 743/10000, Training Loss: 0.24456988275051117, Test Loss: 0.20159965753555298\n", "Epoch 744/10000, Training Loss: 0.24408172070980072, Test Loss: 0.20114831626415253\n", "Epoch 745/10000, Training Loss: 0.2435949742794037, Test Loss: 0.20069915056228638\n", "Epoch 746/10000, Training Loss: 0.24310968816280365, Test Loss: 0.2002522498369217\n", "Epoch 747/10000, Training Loss: 0.24262575805187225, Test Loss: 0.19980761408805847\n", "Epoch 748/10000, Training Loss: 0.24214327335357666, Test Loss: 0.1993657499551773\n", "Epoch 749/10000, Training Loss: 0.24166224896907806, Test Loss: 0.1989254653453827\n", "Epoch 750/10000, Training Loss: 0.24118265509605408, Test Loss: 0.19848662614822388\n", "Epoch 751/10000, Training Loss: 0.24070440232753754, Test Loss: 0.1980494111776352\n", "Epoch 752/10000, Training Loss: 0.2402275800704956, Test Loss: 0.1976148933172226\n", "Epoch 753/10000, Training Loss: 0.23975220322608948, Test Loss: 0.19718259572982788\n", "Epoch 754/10000, Training Loss: 0.2392781376838684, Test Loss: 0.1967524141073227\n", "Epoch 755/10000, Training Loss: 0.23880544304847717, Test Loss: 0.19632406532764435\n", "Epoch 756/10000, Training Loss: 0.23833414912223816, Test Loss: 0.1958964467048645\n", "Epoch 757/10000, Training Loss: 0.23786427080631256, Test Loss: 0.19546984136104584\n", "Epoch 758/10000, Training Loss: 0.2373957335948944, Test Loss: 0.19504529237747192\n", "Epoch 759/10000, Training Loss: 0.23692864179611206, Test Loss: 0.19462266564369202\n", "Epoch 760/10000, Training Loss: 0.23646286129951477, Test Loss: 0.1942005753517151\n", 
"Epoch 761/10000, Training Loss: 0.23599840700626373, Test Loss: 0.19378001987934113\n", "Epoch 762/10000, Training Loss: 0.2355353832244873, Test Loss: 0.19335950911045074\n", "Epoch 763/10000, Training Loss: 0.2350737303495407, Test Loss: 0.19294020533561707\n", "Epoch 764/10000, Training Loss: 0.23461337387561798, Test Loss: 0.1925223469734192\n", "Epoch 765/10000, Training Loss: 0.2341543287038803, Test Loss: 0.19210627675056458\n", "Epoch 766/10000, Training Loss: 0.2336968034505844, Test Loss: 0.19169074296951294\n", "Epoch 767/10000, Training Loss: 0.2332405298948288, Test Loss: 0.19127590954303741\n", "Epoch 768/10000, Training Loss: 0.23278552293777466, Test Loss: 0.19086311757564545\n", "Epoch 769/10000, Training Loss: 0.23233193159103394, Test Loss: 0.1904521882534027\n", "Epoch 770/10000, Training Loss: 0.23187965154647827, Test Loss: 0.1900431513786316\n", "Epoch 771/10000, Training Loss: 0.23142872750759125, Test Loss: 0.18963493406772614\n", "Epoch 772/10000, Training Loss: 0.2309790998697281, Test Loss: 0.18922771513462067\n", "Epoch 773/10000, Training Loss: 0.23053085803985596, Test Loss: 0.1888227015733719\n", "Epoch 774/10000, Training Loss: 0.23008385300636292, Test Loss: 0.18841975927352905\n", "Epoch 775/10000, Training Loss: 0.2296382188796997, Test Loss: 0.18801888823509216\n", "Epoch 776/10000, Training Loss: 0.22919388115406036, Test Loss: 0.18761898577213287\n", "Epoch 777/10000, Training Loss: 0.22875083982944489, Test Loss: 0.1872200071811676\n", "Epoch 778/10000, Training Loss: 0.22830933332443237, Test Loss: 0.186824768781662\n", "Epoch 779/10000, Training Loss: 0.22786888480186462, Test Loss: 0.1864316165447235\n", "Epoch 780/10000, Training Loss: 0.2274298369884491, Test Loss: 0.1860380619764328\n", "Epoch 781/10000, Training Loss: 0.22699205577373505, Test Loss: 0.18564429879188538\n", "Epoch 782/10000, Training Loss: 0.22655566036701202, Test Loss: 0.18525055050849915\n", "Epoch 783/10000, Training Loss: 0.2261204719543457, Test Loss: 0.18485693633556366\n", "Epoch 784/10000, Training Loss: 0.2256866693496704, Test Loss: 0.18446752429008484\n", "Epoch 785/10000, Training Loss: 0.2252541333436966, Test Loss: 0.1840810328722\n", "Epoch 786/10000, Training Loss: 0.22482281923294067, Test Loss: 0.18369722366333008\n", "Epoch 787/10000, Training Loss: 0.22439272701740265, Test Loss: 0.18331386148929596\n", "Epoch 788/10000, Training Loss: 0.22396473586559296, Test Loss: 0.18291330337524414\n", "Epoch 789/10000, Training Loss: 0.22353795170783997, Test Loss: 0.18249870836734772\n", "Epoch 790/10000, Training Loss: 0.22311188280582428, Test Loss: 0.18207450211048126\n", "Epoch 791/10000, Training Loss: 0.22268715500831604, Test Loss: 0.18166503310203552\n", "Epoch 792/10000, Training Loss: 0.22226451337337494, Test Loss: 0.18127106130123138\n", "Epoch 793/10000, Training Loss: 0.22184288501739502, Test Loss: 0.18089063465595245\n", "Epoch 794/10000, Training Loss: 0.22142252326011658, Test Loss: 0.18050545454025269\n", "Epoch 795/10000, Training Loss: 0.22100362181663513, Test Loss: 0.1801166832447052\n", "Epoch 796/10000, Training Loss: 0.22058583796024323, Test Loss: 0.179725781083107\n", "Epoch 797/10000, Training Loss: 0.2201692759990692, Test Loss: 0.17933417856693268\n", "Epoch 798/10000, Training Loss: 0.2197544127702713, Test Loss: 0.17896084487438202\n", "Epoch 799/10000, Training Loss: 0.2193402647972107, Test Loss: 0.17860408127307892\n", "Epoch 800/10000, Training Loss: 0.2189272791147232, Test Loss: 0.17824363708496094\n", "Epoch 801/10000, 
Training Loss: 0.21851585805416107, Test Loss: 0.17787884175777435\n", "Epoch 802/10000, Training Loss: 0.21810561418533325, Test Loss: 0.1775096207857132\n", "Epoch 803/10000, Training Loss: 0.21769653260707855, Test Loss: 0.17713600397109985\n", "Epoch 804/10000, Training Loss: 0.21728868782520294, Test Loss: 0.17676104605197906\n", "Epoch 805/10000, Training Loss: 0.21688209474086761, Test Loss: 0.1764029711484909\n", "Epoch 806/10000, Training Loss: 0.21647679805755615, Test Loss: 0.1760573387145996\n", "Epoch 807/10000, Training Loss: 0.21607279777526855, Test Loss: 0.1757040023803711\n", "Epoch 808/10000, Training Loss: 0.21567019820213318, Test Loss: 0.17534276843070984\n", "Epoch 809/10000, Training Loss: 0.2152685821056366, Test Loss: 0.1749742478132248\n", "Epoch 810/10000, Training Loss: 0.21486832201480865, Test Loss: 0.17461718618869781\n", "Epoch 811/10000, Training Loss: 0.21446926891803741, Test Loss: 0.17427025735378265\n", "Epoch 812/10000, Training Loss: 0.21407145261764526, Test Loss: 0.17391446232795715\n", "Epoch 813/10000, Training Loss: 0.21367484331130981, Test Loss: 0.17355068027973175\n", "Epoch 814/10000, Training Loss: 0.2132793664932251, Test Loss: 0.17318053543567657\n", "Epoch 815/10000, Training Loss: 0.21288558840751648, Test Loss: 0.172825425863266\n", "Epoch 816/10000, Training Loss: 0.2124926596879959, Test Loss: 0.17248454689979553\n", "Epoch 817/10000, Training Loss: 0.212100550532341, Test Loss: 0.17215333878993988\n", "Epoch 818/10000, Training Loss: 0.21171045303344727, Test Loss: 0.1718117892742157\n", "Epoch 819/10000, Training Loss: 0.2113213837146759, Test Loss: 0.17146018147468567\n", "Epoch 820/10000, Training Loss: 0.21093325316905975, Test Loss: 0.17110005021095276\n", "Epoch 821/10000, Training Loss: 0.21054615080356598, Test Loss: 0.17073358595371246\n", "Epoch 822/10000, Training Loss: 0.21016010642051697, Test Loss: 0.17036356031894684\n", "Epoch 823/10000, Training Loss: 0.2097756266593933, Test Loss: 0.17000962793827057\n", "Epoch 824/10000, Training Loss: 0.20939244329929352, Test Loss: 0.16967353224754333\n", "Epoch 825/10000, Training Loss: 0.2090100347995758, Test Loss: 0.16933654248714447\n", "Epoch 826/10000, Training Loss: 0.20862910151481628, Test Loss: 0.16899624466896057\n", "Epoch 827/10000, Training Loss: 0.20824937522411346, Test Loss: 0.16865335404872894\n", "Epoch 828/10000, Training Loss: 0.20787076652050018, Test Loss: 0.16830860078334808\n", "Epoch 829/10000, Training Loss: 0.20749324560165405, Test Loss: 0.16796298325061798\n", "Epoch 830/10000, Training Loss: 0.20711694657802582, Test Loss: 0.16761726140975952\n", "Epoch 831/10000, Training Loss: 0.20674218237400055, Test Loss: 0.16728927195072174\n", "Epoch 832/10000, Training Loss: 0.20636792480945587, Test Loss: 0.16697682440280914\n", "Epoch 833/10000, Training Loss: 0.20599520206451416, Test Loss: 0.16665983200073242\n", "Epoch 834/10000, Training Loss: 0.20562370121479034, Test Loss: 0.16633738577365875\n", "Epoch 835/10000, Training Loss: 0.2052532583475113, Test Loss: 0.16600936651229858\n", "Epoch 836/10000, Training Loss: 0.20488393306732178, Test Loss: 0.1656763106584549\n", "Epoch 837/10000, Training Loss: 0.20451568067073822, Test Loss: 0.1653391420841217\n", "Epoch 838/10000, Training Loss: 0.20414848625659943, Test Loss: 0.16499938070774078\n", "Epoch 839/10000, Training Loss: 0.20378243923187256, Test Loss: 0.16465900838375092\n", "Epoch 840/10000, Training Loss: 0.2034176141023636, Test Loss: 0.16431966423988342\n", "Epoch 841/10000, Training Loss: 
0.20305386185646057, Test Loss: 0.16398301720619202\n", "Epoch 842/10000, Training Loss: 0.20269137620925903, Test Loss: 0.16365277767181396\n", "Epoch 843/10000, Training Loss: 0.20232993364334106, Test Loss: 0.1633269339799881\n", "Epoch 844/10000, Training Loss: 0.20196963846683502, Test Loss: 0.16300566494464874\n", "Epoch 845/10000, Training Loss: 0.20161041617393494, Test Loss: 0.1626884937286377\n", "Epoch 846/10000, Training Loss: 0.20125232636928558, Test Loss: 0.16237491369247437\n", "Epoch 847/10000, Training Loss: 0.2008953094482422, Test Loss: 0.16206425428390503\n", "Epoch 848/10000, Training Loss: 0.20053941011428833, Test Loss: 0.1617557257413864\n", "Epoch 849/10000, Training Loss: 0.200184628367424, Test Loss: 0.16144853830337524\n", "Epoch 850/10000, Training Loss: 0.19983099400997162, Test Loss: 0.16114187240600586\n", "Epoch 851/10000, Training Loss: 0.1994784027338028, Test Loss: 0.1608351767063141\n", "Epoch 852/10000, Training Loss: 0.19912689924240112, Test Loss: 0.1605280488729477\n", "Epoch 853/10000, Training Loss: 0.19877654314041138, Test Loss: 0.160220205783844\n", "Epoch 854/10000, Training Loss: 0.1984272003173828, Test Loss: 0.15991172194480896\n", "Epoch 855/10000, Training Loss: 0.1980789750814438, Test Loss: 0.15960276126861572\n", "Epoch 856/10000, Training Loss: 0.1977318525314331, Test Loss: 0.15929344296455383\n", "Epoch 857/10000, Training Loss: 0.197385773062706, Test Loss: 0.1589844822883606\n", "Epoch 858/10000, Training Loss: 0.19704073667526245, Test Loss: 0.15867634117603302\n", "Epoch 859/10000, Training Loss: 0.19669677317142487, Test Loss: 0.1583695113658905\n", "Epoch 860/10000, Training Loss: 0.19635392725467682, Test Loss: 0.15806469321250916\n", "Epoch 861/10000, Training Loss: 0.19601212441921234, Test Loss: 0.15776216983795166\n", "Epoch 862/10000, Training Loss: 0.19567137956619263, Test Loss: 0.15746192634105682\n", "Epoch 863/10000, Training Loss: 0.1953316032886505, Test Loss: 0.15716394782066345\n", "Epoch 864/10000, Training Loss: 0.19499298930168152, Test Loss: 0.15686775743961334\n", "Epoch 865/10000, Training Loss: 0.19465532898902893, Test Loss: 0.15657295286655426\n", "Epoch 866/10000, Training Loss: 0.1943187266588211, Test Loss: 0.15627944469451904\n", "Epoch 867/10000, Training Loss: 0.19398319721221924, Test Loss: 0.15598705410957336\n", "Epoch 868/10000, Training Loss: 0.19364868104457855, Test Loss: 0.15569552779197693\n", "Epoch 869/10000, Training Loss: 0.19331520795822144, Test Loss: 0.15540474653244019\n", "Epoch 870/10000, Training Loss: 0.1929827332496643, Test Loss: 0.1551172286272049\n", "Epoch 871/10000, Training Loss: 0.19265125691890717, Test Loss: 0.15482984483242035\n", "Epoch 872/10000, Training Loss: 0.1923208087682724, Test Loss: 0.154542475938797\n", "Epoch 873/10000, Training Loss: 0.1919914036989212, Test Loss: 0.15425540506839752\n", "Epoch 874/10000, Training Loss: 0.1916629523038864, Test Loss: 0.1539686769247055\n", "Epoch 875/10000, Training Loss: 0.19133557379245758, Test Loss: 0.15368236601352692\n", "Epoch 876/10000, Training Loss: 0.19100914895534515, Test Loss: 0.15339630842208862\n", "Epoch 877/10000, Training Loss: 0.1906837373971939, Test Loss: 0.1531107872724533\n", "Epoch 878/10000, Training Loss: 0.19035930931568146, Test Loss: 0.15282604098320007\n", "Epoch 879/10000, Training Loss: 0.1900358647108078, Test Loss: 0.1525404453277588\n", "Epoch 880/10000, Training Loss: 0.18971362709999084, Test Loss: 0.15225501358509064\n", "Epoch 881/10000, Training Loss: 0.18939226865768433, Test 
Loss: 0.1519702970981598\n", "Epoch 882/10000, Training Loss: 0.18907180428504944, Test Loss: 0.1516866683959961\n", "Epoch 883/10000, Training Loss: 0.18875259160995483, Test Loss: 0.15140725672245026\n", "Epoch 884/10000, Training Loss: 0.1884344518184662, Test Loss: 0.15113186836242676\n", "Epoch 885/10000, Training Loss: 0.18811725080013275, Test Loss: 0.15086029469966888\n", "Epoch 886/10000, Training Loss: 0.1878010481595993, Test Loss: 0.15059219300746918\n", "Epoch 887/10000, Training Loss: 0.1874857395887375, Test Loss: 0.15032662451267242\n", "Epoch 888/10000, Training Loss: 0.18717139959335327, Test Loss: 0.15006238222122192\n", "Epoch 889/10000, Training Loss: 0.18685802817344666, Test Loss: 0.14979851245880127\n", "Epoch 890/10000, Training Loss: 0.18654559552669525, Test Loss: 0.14953435957431793\n", "Epoch 891/10000, Training Loss: 0.18623408675193787, Test Loss: 0.14926932752132416\n", "Epoch 892/10000, Training Loss: 0.18592354655265808, Test Loss: 0.1490035504102707\n", "Epoch 893/10000, Training Loss: 0.1856139600276947, Test Loss: 0.14873692393302917\n", "Epoch 894/10000, Training Loss: 0.18530526757240295, Test Loss: 0.14846958220005035\n", "Epoch 895/10000, Training Loss: 0.18499748408794403, Test Loss: 0.14820143580436707\n", "Epoch 896/10000, Training Loss: 0.18469077348709106, Test Loss: 0.1479303389787674\n", "Epoch 897/10000, Training Loss: 0.18438498675823212, Test Loss: 0.14765764772891998\n", "Epoch 898/10000, Training Loss: 0.18408003449440002, Test Loss: 0.14738434553146362\n", "Epoch 899/10000, Training Loss: 0.1837759166955948, Test Loss: 0.14711475372314453\n", "Epoch 900/10000, Training Loss: 0.1834729164838791, Test Loss: 0.1468491554260254\n", "Epoch 901/10000, Training Loss: 0.18317078053951263, Test Loss: 0.14658783376216888\n", "Epoch 902/10000, Training Loss: 0.182869553565979, Test Loss: 0.14633092284202576\n", "Epoch 903/10000, Training Loss: 0.1825692057609558, Test Loss: 0.14607766270637512\n", "Epoch 904/10000, Training Loss: 0.18226978182792664, Test Loss: 0.14582459628582\n", "Epoch 905/10000, Training Loss: 0.18197126686573029, Test Loss: 0.1455717235803604\n", "Epoch 906/10000, Training Loss: 0.18167361617088318, Test Loss: 0.14532141387462616\n", "Epoch 907/10000, Training Loss: 0.1813768893480301, Test Loss: 0.1450728178024292\n", "Epoch 908/10000, Training Loss: 0.18108108639717102, Test Loss: 0.14482514560222626\n", "Epoch 909/10000, Training Loss: 0.1807861030101776, Test Loss: 0.14457795023918152\n", "Epoch 910/10000, Training Loss: 0.1804920732975006, Test Loss: 0.14432789385318756\n", "Epoch 911/10000, Training Loss: 0.18019886314868927, Test Loss: 0.14407558739185333\n", "Epoch 912/10000, Training Loss: 0.17990665137767792, Test Loss: 0.14382489025592804\n", "Epoch 913/10000, Training Loss: 0.179615318775177, Test Loss: 0.14357589185237885\n", "Epoch 914/10000, Training Loss: 0.17932482063770294, Test Loss: 0.1433284878730774\n", "Epoch 915/10000, Training Loss: 0.17903515696525574, Test Loss: 0.1430821418762207\n", "Epoch 916/10000, Training Loss: 0.17874634265899658, Test Loss: 0.14283667504787445\n", "Epoch 917/10000, Training Loss: 0.17845837771892548, Test Loss: 0.14259187877178192\n", "Epoch 918/10000, Training Loss: 0.17817136645317078, Test Loss: 0.14234422147274017\n", "Epoch 919/10000, Training Loss: 0.1778852343559265, Test Loss: 0.142094686627388\n", "Epoch 920/10000, Training Loss: 0.17759986221790314, Test Loss: 0.14184688031673431\n", "Epoch 921/10000, Training Loss: 0.17731541395187378, Test Loss: 
0.14160120487213135\n", "Epoch 922/10000, Training Loss: 0.17703180015087128, Test Loss: 0.14135794341564178\n", "Epoch 923/10000, Training Loss: 0.17674897611141205, Test Loss: 0.14111709594726562\n", "Epoch 924/10000, Training Loss: 0.176467165350914, Test Loss: 0.14087536931037903\n", "Epoch 925/10000, Training Loss: 0.17618605494499207, Test Loss: 0.14063318073749542\n", "Epoch 926/10000, Training Loss: 0.1759057641029358, Test Loss: 0.14039380848407745\n", "Epoch 927/10000, Training Loss: 0.17562636733055115, Test Loss: 0.14015725255012512\n", "Epoch 928/10000, Training Loss: 0.17534780502319336, Test Loss: 0.13992400467395782\n", "Epoch 929/10000, Training Loss: 0.17507009208202362, Test Loss: 0.13969390094280243\n", "Epoch 930/10000, Training Loss: 0.17479322850704193, Test Loss: 0.13946329057216644\n", "Epoch 931/10000, Training Loss: 0.17451713979244232, Test Loss: 0.13923189043998718\n", "Epoch 932/10000, Training Loss: 0.1742418110370636, Test Loss: 0.13900165259838104\n", "Epoch 933/10000, Training Loss: 0.1739674061536789, Test Loss: 0.1387718766927719\n", "Epoch 934/10000, Training Loss: 0.17369382083415985, Test Loss: 0.13854221999645233\n", "Epoch 935/10000, Training Loss: 0.1734210103750229, Test Loss: 0.13831301033496857\n", "Epoch 936/10000, Training Loss: 0.17314903438091278, Test Loss: 0.13808125257492065\n", "Epoch 937/10000, Training Loss: 0.17287789285182953, Test Loss: 0.13784781098365784\n", "Epoch 938/10000, Training Loss: 0.17260752618312836, Test Loss: 0.1376163214445114\n", "Epoch 939/10000, Training Loss: 0.17233800888061523, Test Loss: 0.13738730549812317\n", "Epoch 940/10000, Training Loss: 0.172069251537323, Test Loss: 0.13716015219688416\n", "Epoch 941/10000, Training Loss: 0.17180129885673523, Test Loss: 0.1369347721338272\n", "Epoch 942/10000, Training Loss: 0.17153410613536835, Test Loss: 0.1367110162973404\n", "Epoch 943/10000, Training Loss: 0.17126783728599548, Test Loss: 0.1364855170249939\n", "Epoch 944/10000, Training Loss: 0.17100226879119873, Test Loss: 0.1362592875957489\n", "Epoch 945/10000, Training Loss: 0.17073747515678406, Test Loss: 0.136032834649086\n", "Epoch 946/10000, Training Loss: 0.17047348618507385, Test Loss: 0.13580991327762604\n", "Epoch 947/10000, Training Loss: 0.1702103614807129, Test Loss: 0.13558973371982574\n", "Epoch 948/10000, Training Loss: 0.16994790732860565, Test Loss: 0.13537195324897766\n", "Epoch 949/10000, Training Loss: 0.16968627274036407, Test Loss: 0.13515618443489075\n", "Epoch 950/10000, Training Loss: 0.1694253832101822, Test Loss: 0.13494256138801575\n", "Epoch 951/10000, Training Loss: 0.1691652238368988, Test Loss: 0.1347300261259079\n", "Epoch 952/10000, Training Loss: 0.16890601813793182, Test Loss: 0.13451500236988068\n", "Epoch 953/10000, Training Loss: 0.16864749789237976, Test Loss: 0.13429731130599976\n", "Epoch 954/10000, Training Loss: 0.16839046776294708, Test Loss: 0.13405786454677582\n", "Epoch 955/10000, Training Loss: 0.16813403367996216, Test Loss: 0.13380244374275208\n", "Epoch 956/10000, Training Loss: 0.16787831485271454, Test Loss: 0.13354015350341797\n", "Epoch 957/10000, Training Loss: 0.16762331128120422, Test Loss: 0.13327716290950775\n", "Epoch 958/10000, Training Loss: 0.16736915707588196, Test Loss: 0.13301783800125122\n", "Epoch 959/10000, Training Loss: 0.1671159416437149, Test Loss: 0.13278640806674957\n", "Epoch 960/10000, Training Loss: 0.1668635755777359, Test Loss: 0.13258223235607147\n", "Epoch 961/10000, Training Loss: 0.16661189496517181, Test Loss: 
0.1323794573545456\n", "Epoch 962/10000, Training Loss: 0.16636115312576294, Test Loss: 0.13217702507972717\n", "Epoch 963/10000, Training Loss: 0.1661110371351242, Test Loss: 0.13197404146194458\n", "Epoch 964/10000, Training Loss: 0.16586171090602875, Test Loss: 0.13176961243152618\n", "Epoch 965/10000, Training Loss: 0.16561301052570343, Test Loss: 0.13156342506408691\n", "Epoch 966/10000, Training Loss: 0.16536511480808258, Test Loss: 0.13135531544685364\n", "Epoch 967/10000, Training Loss: 0.16511785984039307, Test Loss: 0.13114535808563232\n", "Epoch 968/10000, Training Loss: 0.1648714244365692, Test Loss: 0.1309339702129364\n", "Epoch 969/10000, Training Loss: 0.1646256446838379, Test Loss: 0.1307218372821808\n", "Epoch 970/10000, Training Loss: 0.16438056528568268, Test Loss: 0.130509614944458\n", "Epoch 971/10000, Training Loss: 0.16413623094558716, Test Loss: 0.13029809296131134\n", "Epoch 972/10000, Training Loss: 0.1638927012681961, Test Loss: 0.13008785247802734\n", "Epoch 973/10000, Training Loss: 0.16364985704421997, Test Loss: 0.12987947463989258\n", "Epoch 974/10000, Training Loss: 0.16340789198875427, Test Loss: 0.12967613339424133\n", "Epoch 975/10000, Training Loss: 0.1631666123867035, Test Loss: 0.12949661910533905\n", "Epoch 976/10000, Training Loss: 0.16292594373226166, Test Loss: 0.12931692600250244\n", "Epoch 977/10000, Training Loss: 0.16268600523471832, Test Loss: 0.12913161516189575\n", "Epoch 978/10000, Training Loss: 0.1624469757080078, Test Loss: 0.12893953919410706\n", "Epoch 979/10000, Training Loss: 0.1622086465358734, Test Loss: 0.12874038517475128\n", "Epoch 980/10000, Training Loss: 0.16197092831134796, Test Loss: 0.12853482365608215\n", "Epoch 981/10000, Training Loss: 0.1617339551448822, Test Loss: 0.12832419574260712\n", "Epoch 982/10000, Training Loss: 0.16149793565273285, Test Loss: 0.12812970578670502\n", "Epoch 983/10000, Training Loss: 0.16126246750354767, Test Loss: 0.12795273959636688\n", "Epoch 984/10000, Training Loss: 0.16102756559848785, Test Loss: 0.12777037918567657\n", "Epoch 985/10000, Training Loss: 0.1607935130596161, Test Loss: 0.12758173048496246\n", "Epoch 986/10000, Training Loss: 0.16056019067764282, Test Loss: 0.12738372385501862\n", "Epoch 987/10000, Training Loss: 0.1603275090456009, Test Loss: 0.12717732787132263\n", "Epoch 988/10000, Training Loss: 0.1600954681634903, Test Loss: 0.12696492671966553\n", "Epoch 989/10000, Training Loss: 0.1598641276359558, Test Loss: 0.12675003707408905\n", "Epoch 990/10000, Training Loss: 0.1596338450908661, Test Loss: 0.12655428051948547\n", "Epoch 991/10000, Training Loss: 0.15940386056900024, Test Loss: 0.12637637555599213\n", "Epoch 992/10000, Training Loss: 0.15917444229125977, Test Loss: 0.12619708478450775\n", "Epoch 993/10000, Training Loss: 0.15894591808319092, Test Loss: 0.1260150521993637\n", "Epoch 994/10000, Training Loss: 0.15871815383434296, Test Loss: 0.12582677602767944\n", "Epoch 995/10000, Training Loss: 0.15849106013774872, Test Loss: 0.12563224136829376\n", "Epoch 996/10000, Training Loss: 0.15826457738876343, Test Loss: 0.12543286383152008\n", "Epoch 997/10000, Training Loss: 0.15803872048854828, Test Loss: 0.12523049116134644\n", "Epoch 998/10000, Training Loss: 0.15781360864639282, Test Loss: 0.12503045797348022\n", "Epoch 999/10000, Training Loss: 0.1575891375541687, Test Loss: 0.12483415007591248\n", "Epoch 1000/10000, Training Loss: 0.15736538171768188, Test Loss: 0.12463900446891785\n", "Epoch 1001/10000, Training Loss: 0.15714232623577118, Test Loss: 
0.1244468167424202\n", "Epoch 1002/10000, Training Loss: 0.1569199413061142, Test Loss: 0.1242581456899643\n", "Epoch 1003/10000, Training Loss: 0.15669816732406616, Test Loss: 0.12407246977090836\n", "Epoch 1004/10000, Training Loss: 0.15647703409194946, Test Loss: 0.1238897442817688\n", "Epoch 1005/10000, Training Loss: 0.15625658631324768, Test Loss: 0.12370961159467697\n", "Epoch 1006/10000, Training Loss: 0.15603677928447723, Test Loss: 0.12353166192770004\n", "Epoch 1007/10000, Training Loss: 0.15581759810447693, Test Loss: 0.12335516512393951\n", "Epoch 1008/10000, Training Loss: 0.15559911727905273, Test Loss: 0.12317966669797897\n", "Epoch 1009/10000, Training Loss: 0.1553812474012375, Test Loss: 0.12300483882427216\n", "Epoch 1010/10000, Training Loss: 0.15516403317451477, Test Loss: 0.12283015996217728\n", "Epoch 1011/10000, Training Loss: 0.15494748950004578, Test Loss: 0.12265759706497192\n", "Epoch 1012/10000, Training Loss: 0.1547316163778305, Test Loss: 0.12248704582452774\n", "Epoch 1013/10000, Training Loss: 0.1545162945985794, Test Loss: 0.12231460213661194\n", "Epoch 1014/10000, Training Loss: 0.1543016880750656, Test Loss: 0.1221390888094902\n", "Epoch 1015/10000, Training Loss: 0.15408766269683838, Test Loss: 0.12196078151464462\n", "Epoch 1016/10000, Training Loss: 0.15387427806854248, Test Loss: 0.12178053706884384\n", "Epoch 1017/10000, Training Loss: 0.15366153419017792, Test Loss: 0.12159998714923859\n", "Epoch 1018/10000, Training Loss: 0.1534494310617447, Test Loss: 0.12141972780227661\n", "Epoch 1019/10000, Training Loss: 0.15323787927627563, Test Loss: 0.12123972922563553\n", "Epoch 1020/10000, Training Loss: 0.15302705764770508, Test Loss: 0.1210632473230362\n", "Epoch 1021/10000, Training Loss: 0.1528168022632599, Test Loss: 0.12089046835899353\n", "Epoch 1022/10000, Training Loss: 0.15260714292526245, Test Loss: 0.12071801722049713\n", "Epoch 1023/10000, Training Loss: 0.15239813923835754, Test Loss: 0.12054603546857834\n", "Epoch 1024/10000, Training Loss: 0.1521897315979004, Test Loss: 0.1203753799200058\n", "Epoch 1025/10000, Training Loss: 0.15198197960853577, Test Loss: 0.12020577490329742\n", "Epoch 1026/10000, Training Loss: 0.15177470445632935, Test Loss: 0.12003728002309799\n", "Epoch 1027/10000, Training Loss: 0.15156811475753784, Test Loss: 0.11986886709928513\n", "Epoch 1028/10000, Training Loss: 0.1513621211051941, Test Loss: 0.11970095336437225\n", "Epoch 1029/10000, Training Loss: 0.1511567384004593, Test Loss: 0.1195334866642952\n", "Epoch 1030/10000, Training Loss: 0.15095195174217224, Test Loss: 0.11936637759208679\n", "Epoch 1031/10000, Training Loss: 0.15074771642684937, Test Loss: 0.11919975280761719\n", "Epoch 1032/10000, Training Loss: 0.15054410696029663, Test Loss: 0.1190330907702446\n", "Epoch 1033/10000, Training Loss: 0.15034103393554688, Test Loss: 0.11886728554964066\n", "Epoch 1034/10000, Training Loss: 0.1501387059688568, Test Loss: 0.11870535463094711\n", "Epoch 1035/10000, Training Loss: 0.14993681013584137, Test Loss: 0.11854664981365204\n", "Epoch 1036/10000, Training Loss: 0.14973551034927368, Test Loss: 0.11838649958372116\n", "Epoch 1037/10000, Training Loss: 0.14953483641147614, Test Loss: 0.1182251125574112\n", "Epoch 1038/10000, Training Loss: 0.14933475852012634, Test Loss: 0.11806243658065796\n", "Epoch 1039/10000, Training Loss: 0.1491352617740631, Test Loss: 0.11789871007204056\n", "Epoch 1040/10000, Training Loss: 0.14893627166748047, Test Loss: 0.11773498356342316\n", "Epoch 1041/10000, Training Loss: 
0.14873792231082916, Test Loss: 0.11757111549377441\n", "Epoch 1042/10000, Training Loss: 0.14854010939598083, Test Loss: 0.11740763485431671\n", "Epoch 1043/10000, Training Loss: 0.14834286272525787, Test Loss: 0.11724482476711273\n", "Epoch 1044/10000, Training Loss: 0.14814616739749908, Test Loss: 0.11708232760429382\n", "Epoch 1045/10000, Training Loss: 0.14795009791851044, Test Loss: 0.11692040413618088\n", "Epoch 1046/10000, Training Loss: 0.14775453507900238, Test Loss: 0.11676304787397385\n", "Epoch 1047/10000, Training Loss: 0.1475595384836197, Test Loss: 0.11660967767238617\n", "Epoch 1048/10000, Training Loss: 0.14736509323120117, Test Loss: 0.11645650118589401\n", "Epoch 1049/10000, Training Loss: 0.14717140793800354, Test Loss: 0.11629221588373184\n", "Epoch 1050/10000, Training Loss: 0.1469782441854477, Test Loss: 0.11611896753311157\n", "Epoch 1051/10000, Training Loss: 0.14678551256656647, Test Loss: 0.11593931913375854\n", "Epoch 1052/10000, Training Loss: 0.1465933471918106, Test Loss: 0.11575670540332794\n", "Epoch 1053/10000, Training Loss: 0.1464017927646637, Test Loss: 0.11558549106121063\n", "Epoch 1054/10000, Training Loss: 0.1462109088897705, Test Loss: 0.1154264435172081\n", "Epoch 1055/10000, Training Loss: 0.14602042734622955, Test Loss: 0.11527903378009796\n", "Epoch 1056/10000, Training Loss: 0.14583057165145874, Test Loss: 0.11512990295886993\n", "Epoch 1057/10000, Training Loss: 0.1456412971019745, Test Loss: 0.11497880518436432\n", "Epoch 1058/10000, Training Loss: 0.14545254409313202, Test Loss: 0.11482582241296768\n", "Epoch 1059/10000, Training Loss: 0.14526429772377014, Test Loss: 0.11467183381319046\n", "Epoch 1060/10000, Training Loss: 0.14507654309272766, Test Loss: 0.11451727896928787\n", "Epoch 1061/10000, Training Loss: 0.14488941431045532, Test Loss: 0.11436295509338379\n", "Epoch 1062/10000, Training Loss: 0.14470282196998596, Test Loss: 0.11420932412147522\n", "Epoch 1063/10000, Training Loss: 0.1445166915655136, Test Loss: 0.11405665427446365\n", "Epoch 1064/10000, Training Loss: 0.14433112740516663, Test Loss: 0.11390450596809387\n", "Epoch 1065/10000, Training Loss: 0.14414608478546143, Test Loss: 0.11375322937965393\n", "Epoch 1066/10000, Training Loss: 0.1439615935087204, Test Loss: 0.11360345035791397\n", "Epoch 1067/10000, Training Loss: 0.14377762377262115, Test Loss: 0.11345480382442474\n", "Epoch 1068/10000, Training Loss: 0.14359420537948608, Test Loss: 0.11331840604543686\n", "Epoch 1069/10000, Training Loss: 0.14341123402118683, Test Loss: 0.11318112909793854\n", "Epoch 1070/10000, Training Loss: 0.14322887361049652, Test Loss: 0.11304238438606262\n", "Epoch 1071/10000, Training Loss: 0.1430470049381256, Test Loss: 0.11290141940116882\n", "Epoch 1072/10000, Training Loss: 0.14286565780639648, Test Loss: 0.11275716125965118\n", "Epoch 1073/10000, Training Loss: 0.14268481731414795, Test Loss: 0.11261020600795746\n", "Epoch 1074/10000, Training Loss: 0.1425045132637024, Test Loss: 0.11246120184659958\n", "Epoch 1075/10000, Training Loss: 0.14232474565505981, Test Loss: 0.11232208460569382\n", "Epoch 1076/10000, Training Loss: 0.14214548468589783, Test Loss: 0.1121913492679596\n", "Epoch 1077/10000, Training Loss: 0.14196670055389404, Test Loss: 0.11205586791038513\n", "Epoch 1078/10000, Training Loss: 0.14178846776485443, Test Loss: 0.11191431432962418\n", "Epoch 1079/10000, Training Loss: 0.14161071181297302, Test Loss: 0.11176793277263641\n", "Epoch 1080/10000, Training Loss: 0.1414334774017334, Test Loss: 0.11161762475967407\n", 
"Epoch 1081/10000, Training Loss: 0.1412566751241684, Test Loss: 0.11146491020917892\n", "Epoch 1082/10000, Training Loss: 0.14108045399188995, Test Loss: 0.11131216585636139\n", "Epoch 1083/10000, Training Loss: 0.14090465009212494, Test Loss: 0.1111605316400528\n", "Epoch 1084/10000, Training Loss: 0.14072951674461365, Test Loss: 0.11102210730314255\n", "Epoch 1085/10000, Training Loss: 0.14055469632148743, Test Loss: 0.1108955666422844\n", "Epoch 1086/10000, Training Loss: 0.14038044214248657, Test Loss: 0.11076673865318298\n", "Epoch 1087/10000, Training Loss: 0.1402066946029663, Test Loss: 0.1106346920132637\n", "Epoch 1088/10000, Training Loss: 0.14003345370292664, Test Loss: 0.11049772799015045\n", "Epoch 1089/10000, Training Loss: 0.13986070454120636, Test Loss: 0.11035678535699844\n", "Epoch 1090/10000, Training Loss: 0.1396884322166443, Test Loss: 0.11021264642477036\n", "Epoch 1091/10000, Training Loss: 0.1395166516304016, Test Loss: 0.11006738245487213\n", "Epoch 1092/10000, Training Loss: 0.13934534788131714, Test Loss: 0.10992226749658585\n", "Epoch 1093/10000, Training Loss: 0.13917455077171326, Test Loss: 0.10977741330862045\n", "Epoch 1094/10000, Training Loss: 0.13900423049926758, Test Loss: 0.10963404178619385\n", "Epoch 1095/10000, Training Loss: 0.1388344019651413, Test Loss: 0.10949305444955826\n", "Epoch 1096/10000, Training Loss: 0.13866503536701202, Test Loss: 0.10935502499341965\n", "Epoch 1097/10000, Training Loss: 0.13849616050720215, Test Loss: 0.10922018438577652\n", "Epoch 1098/10000, Training Loss: 0.1383277177810669, Test Loss: 0.10908889770507812\n", "Epoch 1099/10000, Training Loss: 0.13815978169441223, Test Loss: 0.10896199196577072\n", "Epoch 1100/10000, Training Loss: 0.13799229264259338, Test Loss: 0.10883793979883194\n", "Epoch 1101/10000, Training Loss: 0.13782534003257751, Test Loss: 0.10871574282646179\n", "Epoch 1102/10000, Training Loss: 0.1376587599515915, Test Loss: 0.10859401524066925\n", "Epoch 1103/10000, Training Loss: 0.13749265670776367, Test Loss: 0.10847163945436478\n", "Epoch 1104/10000, Training Loss: 0.13732707500457764, Test Loss: 0.10834692418575287\n", "Epoch 1105/10000, Training Loss: 0.13716191053390503, Test Loss: 0.10821942985057831\n", "Epoch 1106/10000, Training Loss: 0.13699720799922943, Test Loss: 0.10808976739645004\n", "Epoch 1107/10000, Training Loss: 0.13683298230171204, Test Loss: 0.10795766115188599\n", "Epoch 1108/10000, Training Loss: 0.13666920363903046, Test Loss: 0.10782381892204285\n", "Epoch 1109/10000, Training Loss: 0.1365058571100235, Test Loss: 0.10768888890743256\n", "Epoch 1110/10000, Training Loss: 0.13634295761585236, Test Loss: 0.10755518823862076\n", "Epoch 1111/10000, Training Loss: 0.1361805498600006, Test Loss: 0.10742297023534775\n", "Epoch 1112/10000, Training Loss: 0.1360185444355011, Test Loss: 0.10729284584522247\n", "Epoch 1113/10000, Training Loss: 0.1358570158481598, Test Loss: 0.10716494172811508\n", "Epoch 1114/10000, Training Loss: 0.1356959193944931, Test Loss: 0.10704049468040466\n", "Epoch 1115/10000, Training Loss: 0.13553525507450104, Test Loss: 0.10691745579242706\n", "Epoch 1116/10000, Training Loss: 0.13537506759166718, Test Loss: 0.10679493099451065\n", "Epoch 1117/10000, Training Loss: 0.13521529734134674, Test Loss: 0.10667212307453156\n", "Epoch 1118/10000, Training Loss: 0.13505598902702332, Test Loss: 0.10654886066913605\n", "Epoch 1119/10000, Training Loss: 0.13489705324172974, Test Loss: 0.10642503201961517\n", "Epoch 1120/10000, Training Loss: 0.13473857939243317, Test 
Loss: 0.10630124807357788\n", "Epoch 1121/10000, Training Loss: 0.1345806121826172, Test Loss: 0.10617820918560028\n", "Epoch 1122/10000, Training Loss: 0.13442297279834747, Test Loss: 0.10605540126562119\n", "Epoch 1123/10000, Training Loss: 0.13426578044891357, Test Loss: 0.10593430697917938\n", "Epoch 1124/10000, Training Loss: 0.13410910964012146, Test Loss: 0.10581433773040771\n", "Epoch 1125/10000, Training Loss: 0.1339527666568756, Test Loss: 0.10569518059492111\n", "Epoch 1126/10000, Training Loss: 0.13379688560962677, Test Loss: 0.10557523369789124\n", "Epoch 1127/10000, Training Loss: 0.13364143669605255, Test Loss: 0.10545410960912704\n", "Epoch 1128/10000, Training Loss: 0.13348634541034698, Test Loss: 0.10533193498849869\n", "Epoch 1129/10000, Training Loss: 0.133331760764122, Test Loss: 0.1052091047167778\n", "Epoch 1130/10000, Training Loss: 0.13317754864692688, Test Loss: 0.1050885021686554\n", "Epoch 1131/10000, Training Loss: 0.13302375376224518, Test Loss: 0.10497023910284042\n", "Epoch 1132/10000, Training Loss: 0.1328703761100769, Test Loss: 0.10485265403985977\n", "Epoch 1133/10000, Training Loss: 0.13271741569042206, Test Loss: 0.10473552346229553\n", "Epoch 1134/10000, Training Loss: 0.13256490230560303, Test Loss: 0.10462068021297455\n", "Epoch 1135/10000, Training Loss: 0.13241280615329742, Test Loss: 0.10450746864080429\n", "Epoch 1136/10000, Training Loss: 0.13226106762886047, Test Loss: 0.10439229011535645\n", "Epoch 1137/10000, Training Loss: 0.13210976123809814, Test Loss: 0.10427558422088623\n", "Epoch 1138/10000, Training Loss: 0.13195888698101044, Test Loss: 0.10415764153003693\n", "Epoch 1139/10000, Training Loss: 0.13180840015411377, Test Loss: 0.10403949022293091\n", "Epoch 1140/10000, Training Loss: 0.13165831565856934, Test Loss: 0.10392152518033981\n", "Epoch 1141/10000, Training Loss: 0.13150864839553833, Test Loss: 0.10380387306213379\n", "Epoch 1142/10000, Training Loss: 0.13135932385921478, Test Loss: 0.10368644446134567\n", "Epoch 1143/10000, Training Loss: 0.13121047616004944, Test Loss: 0.10357198119163513\n", "Epoch 1144/10000, Training Loss: 0.13106197118759155, Test Loss: 0.10346024483442307\n", "Epoch 1145/10000, Training Loss: 0.1309138536453247, Test Loss: 0.10334829241037369\n", "Epoch 1146/10000, Training Loss: 0.13076618313789368, Test Loss: 0.10323585569858551\n", "Epoch 1147/10000, Training Loss: 0.13061891496181488, Test Loss: 0.10312352329492569\n", "Epoch 1148/10000, Training Loss: 0.13047200441360474, Test Loss: 0.10301147401332855\n", "Epoch 1149/10000, Training Loss: 0.13032548129558563, Test Loss: 0.10289908945560455\n", "Epoch 1150/10000, Training Loss: 0.13017934560775757, Test Loss: 0.10278591513633728\n", "Epoch 1151/10000, Training Loss: 0.13003361225128174, Test Loss: 0.10267249494791031\n", "Epoch 1152/10000, Training Loss: 0.12988823652267456, Test Loss: 0.102559894323349\n", "Epoch 1153/10000, Training Loss: 0.12974324822425842, Test Loss: 0.10245133191347122\n", "Epoch 1154/10000, Training Loss: 0.1295986920595169, Test Loss: 0.10234205424785614\n", "Epoch 1155/10000, Training Loss: 0.12945447862148285, Test Loss: 0.10223256796598434\n", "Epoch 1156/10000, Training Loss: 0.12931068241596222, Test Loss: 0.10212597250938416\n", "Epoch 1157/10000, Training Loss: 0.12916724383831024, Test Loss: 0.10201912373304367\n", "Epoch 1158/10000, Training Loss: 0.1290242224931717, Test Loss: 0.10191120952367783\n", "Epoch 1159/10000, Training Loss: 0.1288815140724182, Test Loss: 0.10180234909057617\n", "Epoch 1160/10000, Training 
Loss: 0.12873926758766174, Test Loss: 0.1016928181052208\n", "Epoch 1161/10000, Training Loss: 0.12859731912612915, Test Loss: 0.10158339142799377\n", "Epoch 1162/10000, Training Loss: 0.1284557431936264, Test Loss: 0.10147372633218765\n", "Epoch 1163/10000, Training Loss: 0.1283145546913147, Test Loss: 0.10136711597442627\n", "Epoch 1164/10000, Training Loss: 0.12817378342151642, Test Loss: 0.10126031935214996\n", "Epoch 1165/10000, Training Loss: 0.1280333399772644, Test Loss: 0.10115405917167664\n", "Epoch 1166/10000, Training Loss: 0.12789323925971985, Test Loss: 0.10104817152023315\n", "Epoch 1167/10000, Training Loss: 0.12775354087352753, Test Loss: 0.10094217211008072\n", "Epoch 1168/10000, Training Loss: 0.12761421501636505, Test Loss: 0.10083861649036407\n", "Epoch 1169/10000, Training Loss: 0.12747526168823242, Test Loss: 0.1007346510887146\n", "Epoch 1170/10000, Training Loss: 0.12733665108680725, Test Loss: 0.10063008964061737\n", "Epoch 1171/10000, Training Loss: 0.12719842791557312, Test Loss: 0.10052584856748581\n", "Epoch 1172/10000, Training Loss: 0.12706053256988525, Test Loss: 0.10042175650596619\n", "Epoch 1173/10000, Training Loss: 0.12692297995090485, Test Loss: 0.10031784325838089\n", "Epoch 1174/10000, Training Loss: 0.12678584456443787, Test Loss: 0.10021364688873291\n", "Epoch 1175/10000, Training Loss: 0.12664903700351715, Test Loss: 0.10011254251003265\n", "Epoch 1176/10000, Training Loss: 0.1265125721693039, Test Loss: 0.10001145303249359\n", "Epoch 1177/10000, Training Loss: 0.12637652456760406, Test Loss: 0.09990967065095901\n", "Epoch 1178/10000, Training Loss: 0.12624076008796692, Test Loss: 0.09980719536542892\n", "Epoch 1179/10000, Training Loss: 0.12610536813735962, Test Loss: 0.09970428794622421\n", "Epoch 1180/10000, Training Loss: 0.12597034871578217, Test Loss: 0.09960097819566727\n", "Epoch 1181/10000, Training Loss: 0.12583564221858978, Test Loss: 0.0994981899857521\n", "Epoch 1182/10000, Training Loss: 0.12570133805274963, Test Loss: 0.09939876198768616\n", "Epoch 1183/10000, Training Loss: 0.12556733191013336, Test Loss: 0.09929939359426498\n", "Epoch 1184/10000, Training Loss: 0.12543372809886932, Test Loss: 0.09920056164264679\n", "Epoch 1185/10000, Training Loss: 0.12530042231082916, Test Loss: 0.09910202771425247\n", "Epoch 1186/10000, Training Loss: 0.12516747415065765, Test Loss: 0.09900364279747009\n", "Epoch 1187/10000, Training Loss: 0.1250348687171936, Test Loss: 0.09890536963939667\n", "Epoch 1188/10000, Training Loss: 0.1249026209115982, Test Loss: 0.09880928695201874\n", "Epoch 1189/10000, Training Loss: 0.12477072328329086, Test Loss: 0.09871144592761993\n", "Epoch 1190/10000, Training Loss: 0.1246391162276268, Test Loss: 0.0986124575138092\n", "Epoch 1191/10000, Training Loss: 0.12450786679983139, Test Loss: 0.0985134169459343\n", "Epoch 1192/10000, Training Loss: 0.12437703460454941, Test Loss: 0.09841455519199371\n", "Epoch 1193/10000, Training Loss: 0.12424644827842712, Test Loss: 0.09831556677818298\n", "Epoch 1194/10000, Training Loss: 0.12411618232727051, Test Loss: 0.09821731597185135\n", "Epoch 1195/10000, Training Loss: 0.12398631870746613, Test Loss: 0.09811988472938538\n", "Epoch 1196/10000, Training Loss: 0.12385677546262741, Test Loss: 0.09802334755659103\n", "Epoch 1197/10000, Training Loss: 0.12372756749391556, Test Loss: 0.09793049097061157\n", "Epoch 1198/10000, Training Loss: 0.12359868735074997, Test Loss: 0.09783642739057541\n", "Epoch 1199/10000, Training Loss: 0.12347011268138885, Test Loss: 
0.09774173051118851\n", "Epoch 1200/10000, Training Loss: 0.123341865837574, Test Loss: 0.09764641523361206\n", "Epoch 1201/10000, Training Loss: 0.1232139989733696, Test Loss: 0.09755143523216248\n", "...\n", "Epoch 2028/10000, Training Loss: 0.07463262975215912, Test Loss: 0.06736740469932556\n", "Epoch 2029/10000, Training Loss: 0.07460664212703705, Test Loss: 0.06735607236623764\n", "Epoch 2030/10000, 
Training Loss: 0.07458066940307617, Test Loss: 0.06734490394592285\n", "Epoch 2031/10000, Training Loss: 0.07455476373434067, Test Loss: 0.06733403354883194\n", "Epoch 2032/10000, Training Loss: 0.07452887296676636, Test Loss: 0.06732335686683655\n", "Epoch 2033/10000, Training Loss: 0.07450301945209503, Test Loss: 0.06731297820806503\n", "Epoch 2034/10000, Training Loss: 0.0744771808385849, Test Loss: 0.06730262190103531\n", "Epoch 2035/10000, Training Loss: 0.07445144653320312, Test Loss: 0.06729240715503693\n", "Epoch 2036/10000, Training Loss: 0.07442568242549896, Test Loss: 0.06728217005729675\n", "Epoch 2037/10000, Training Loss: 0.07439997792243958, Test Loss: 0.0672718733549118\n", "Epoch 2038/10000, Training Loss: 0.07437428086996078, Test Loss: 0.06726142019033432\n", "Epoch 2039/10000, Training Loss: 0.07434864342212677, Test Loss: 0.06725070625543594\n", "Epoch 2040/10000, Training Loss: 0.07432302832603455, Test Loss: 0.0672399252653122\n", "Epoch 2041/10000, Training Loss: 0.07429743558168411, Test Loss: 0.06722914427518845\n", "Epoch 2042/10000, Training Loss: 0.07427188754081726, Test Loss: 0.06721822917461395\n", "Epoch 2043/10000, Training Loss: 0.0742463767528534, Test Loss: 0.0672074556350708\n", "Epoch 2044/10000, Training Loss: 0.07422089576721191, Test Loss: 0.06719671189785004\n", "Epoch 2045/10000, Training Loss: 0.07419545948505402, Test Loss: 0.06718579679727554\n", "Epoch 2046/10000, Training Loss: 0.07417000830173492, Test Loss: 0.06717514991760254\n", "Epoch 2047/10000, Training Loss: 0.074144646525383, Test Loss: 0.0671648159623146\n", "Epoch 2048/10000, Training Loss: 0.07411926239728928, Test Loss: 0.06715458631515503\n", "Epoch 2049/10000, Training Loss: 0.07409395277500153, Test Loss: 0.0671447291970253\n", "Epoch 2050/10000, Training Loss: 0.07406868040561676, Test Loss: 0.06713487207889557\n", "Epoch 2051/10000, Training Loss: 0.0740433931350708, Test Loss: 0.06712505221366882\n", "Epoch 2052/10000, Training Loss: 0.07401818037033081, Test Loss: 0.0671151876449585\n", "Epoch 2053/10000, Training Loss: 0.07399298995733261, Test Loss: 0.06710508465766907\n", "Epoch 2054/10000, Training Loss: 0.073967844247818, Test Loss: 0.0670948326587677\n", "Epoch 2055/10000, Training Loss: 0.07394268363714218, Test Loss: 0.0670844241976738\n", "Epoch 2056/10000, Training Loss: 0.07391759753227234, Test Loss: 0.06707404553890228\n", "Epoch 2057/10000, Training Loss: 0.0738925114274025, Test Loss: 0.06706344336271286\n", "Epoch 2058/10000, Training Loss: 0.07386751472949982, Test Loss: 0.0670529305934906\n", "Epoch 2059/10000, Training Loss: 0.07384248077869415, Test Loss: 0.06704249978065491\n", "Epoch 2060/10000, Training Loss: 0.07381753623485565, Test Loss: 0.06703224778175354\n", "Epoch 2061/10000, Training Loss: 0.07379259169101715, Test Loss: 0.06702210754156113\n", "Epoch 2062/10000, Training Loss: 0.07376768440008163, Test Loss: 0.06701218336820602\n", "Epoch 2063/10000, Training Loss: 0.07374284416437149, Test Loss: 0.06700193881988525\n", "Epoch 2064/10000, Training Loss: 0.07371796667575836, Test Loss: 0.06699184328317642\n", "Epoch 2065/10000, Training Loss: 0.0736931711435318, Test Loss: 0.06698188185691833\n", "Epoch 2066/10000, Training Loss: 0.07366839051246643, Test Loss: 0.06697216629981995\n", "Epoch 2067/10000, Training Loss: 0.07364365458488464, Test Loss: 0.06696245074272156\n", "Epoch 2068/10000, Training Loss: 0.07361890375614166, Test Loss: 0.06695283204317093\n", "Epoch 2069/10000, Training Loss: 0.07359422743320465, Test Loss: 
0.06694319099187851\n", "Epoch 2070/10000, Training Loss: 0.07356957346200943, Test Loss: 0.06693343818187714\n", "Epoch 2071/10000, Training Loss: 0.07354491949081421, Test Loss: 0.06692367047071457\n", "Epoch 2072/10000, Training Loss: 0.07352038472890854, Test Loss: 0.06691377609968185\n", "Epoch 2073/10000, Training Loss: 0.07349579781293869, Test Loss: 0.0669037401676178\n", "Epoch 2074/10000, Training Loss: 0.07347126305103302, Test Loss: 0.06689366698265076\n", "Epoch 2075/10000, Training Loss: 0.07344675809144974, Test Loss: 0.06688360124826431\n", "Epoch 2076/10000, Training Loss: 0.07342228293418884, Test Loss: 0.0668737068772316\n", "Epoch 2077/10000, Training Loss: 0.07339784502983093, Test Loss: 0.06686384230852127\n", "Epoch 2078/10000, Training Loss: 0.0733734592795372, Test Loss: 0.0668540894985199\n", "Epoch 2079/10000, Training Loss: 0.07334910333156586, Test Loss: 0.06684442609548569\n", "Epoch 2080/10000, Training Loss: 0.07332473248243332, Test Loss: 0.06683488935232162\n", "Epoch 2081/10000, Training Loss: 0.07330039888620377, Test Loss: 0.06682533025741577\n", "Epoch 2082/10000, Training Loss: 0.0732761099934578, Test Loss: 0.06681574881076813\n", "Epoch 2083/10000, Training Loss: 0.0732518658041954, Test Loss: 0.06680621951818466\n", "Epoch 2084/10000, Training Loss: 0.07322762161493301, Test Loss: 0.06679637730121613\n", "Epoch 2085/10000, Training Loss: 0.07320341467857361, Test Loss: 0.06678643077611923\n", "Epoch 2086/10000, Training Loss: 0.07317925989627838, Test Loss: 0.06677675992250443\n", "Epoch 2087/10000, Training Loss: 0.07315510511398315, Test Loss: 0.0667671486735344\n", "Epoch 2088/10000, Training Loss: 0.07313099503517151, Test Loss: 0.06675774604082108\n", "Epoch 2089/10000, Training Loss: 0.07310691475868225, Test Loss: 0.06674834340810776\n", "Epoch 2090/10000, Training Loss: 0.073082834482193, Test Loss: 0.06673911958932877\n", "Epoch 2091/10000, Training Loss: 0.0730588436126709, Test Loss: 0.06672988831996918\n", "Epoch 2092/10000, Training Loss: 0.07303481549024582, Test Loss: 0.06672061234712601\n", "Epoch 2093/10000, Training Loss: 0.0730108916759491, Test Loss: 0.06671133637428284\n", "Epoch 2094/10000, Training Loss: 0.07298697531223297, Test Loss: 0.06670185923576355\n", "Epoch 2095/10000, Training Loss: 0.07296305149793625, Test Loss: 0.06669238954782486\n", "Epoch 2096/10000, Training Loss: 0.0729391947388649, Test Loss: 0.06668294221162796\n", "Epoch 2097/10000, Training Loss: 0.07291534543037415, Test Loss: 0.06667350977659225\n", "Epoch 2098/10000, Training Loss: 0.07289151847362518, Test Loss: 0.06666406989097595\n", "Epoch 2099/10000, Training Loss: 0.0728677362203598, Test Loss: 0.06665468215942383\n", "Epoch 2100/10000, Training Loss: 0.07284398376941681, Test Loss: 0.0666453093290329\n", "Epoch 2101/10000, Training Loss: 0.0728202536702156, Test Loss: 0.06663603335618973\n", "Epoch 2102/10000, Training Loss: 0.072796531021595, Test Loss: 0.06662687659263611\n", "Epoch 2103/10000, Training Loss: 0.07277286052703857, Test Loss: 0.06661774218082428\n", "Epoch 2104/10000, Training Loss: 0.07274921238422394, Test Loss: 0.06660868972539902\n", "Epoch 2105/10000, Training Loss: 0.07272560894489288, Test Loss: 0.06659957766532898\n", "Epoch 2106/10000, Training Loss: 0.07270202040672302, Test Loss: 0.06659046560525894\n", "Epoch 2107/10000, Training Loss: 0.07267848402261734, Test Loss: 0.06658132374286652\n", "Epoch 2108/10000, Training Loss: 0.07265496253967285, Test Loss: 0.06657225638628006\n", "Epoch 2109/10000, Training Loss: 
0.07263143360614777, Test Loss: 0.06656312197446823\n", "Epoch 2110/10000, Training Loss: 0.07260797917842865, Test Loss: 0.0665540099143982\n", "Epoch 2111/10000, Training Loss: 0.07258453965187073, Test Loss: 0.06654483824968338\n", "Epoch 2112/10000, Training Loss: 0.072561115026474, Test Loss: 0.06653527915477753\n", "Epoch 2113/10000, Training Loss: 0.07253775000572205, Test Loss: 0.06652595847845078\n", "Epoch 2114/10000, Training Loss: 0.07251438498497009, Test Loss: 0.06651675701141357\n", "Epoch 2115/10000, Training Loss: 0.07249105721712112, Test Loss: 0.06650789082050323\n", "Epoch 2116/10000, Training Loss: 0.07246774435043335, Test Loss: 0.06649922579526901\n", "Epoch 2117/10000, Training Loss: 0.07244446128606796, Test Loss: 0.06649059057235718\n", "Epoch 2118/10000, Training Loss: 0.07242120802402496, Test Loss: 0.06648200005292892\n", "Epoch 2119/10000, Training Loss: 0.07239802926778793, Test Loss: 0.06647348403930664\n", "Epoch 2120/10000, Training Loss: 0.07237483561038971, Test Loss: 0.06646477431058884\n", "Epoch 2121/10000, Training Loss: 0.07235165685415268, Test Loss: 0.06645598262548447\n", "Epoch 2122/10000, Training Loss: 0.07232850790023804, Test Loss: 0.06644702702760696\n", "Epoch 2123/10000, Training Loss: 0.07230536639690399, Test Loss: 0.06643807142972946\n", "Epoch 2124/10000, Training Loss: 0.0722823515534401, Test Loss: 0.0664290189743042\n", "Epoch 2125/10000, Training Loss: 0.07225927710533142, Test Loss: 0.06641992926597595\n", "Epoch 2126/10000, Training Loss: 0.07223625481128693, Test Loss: 0.06641098856925964\n", "Epoch 2127/10000, Training Loss: 0.07221323996782303, Test Loss: 0.0664021447300911\n", "Epoch 2128/10000, Training Loss: 0.07219026237726212, Test Loss: 0.06639330089092255\n", "Epoch 2129/10000, Training Loss: 0.07216735929250717, Test Loss: 0.06638465076684952\n", "Epoch 2130/10000, Training Loss: 0.07214441150426865, Test Loss: 0.06637602299451828\n", "Epoch 2131/10000, Training Loss: 0.07212154567241669, Test Loss: 0.06636745482683182\n", "Epoch 2132/10000, Training Loss: 0.07209865748882294, Test Loss: 0.06635897606611252\n", "Epoch 2133/10000, Training Loss: 0.07207581400871277, Test Loss: 0.06635040044784546\n", "Epoch 2134/10000, Training Loss: 0.07205300778150558, Test Loss: 0.0663418099284172\n", "Epoch 2135/10000, Training Loss: 0.07203022390604019, Test Loss: 0.06633323431015015\n", "Epoch 2136/10000, Training Loss: 0.07200746238231659, Test Loss: 0.06632450222969055\n", "Epoch 2137/10000, Training Loss: 0.07198474556207657, Test Loss: 0.06631576269865036\n", "Epoch 2138/10000, Training Loss: 0.07196200639009476, Test Loss: 0.06630709767341614\n", "Epoch 2139/10000, Training Loss: 0.07193935662508011, Test Loss: 0.06629843264818192\n", "Epoch 2140/10000, Training Loss: 0.07191669195890427, Test Loss: 0.06628991663455963\n", "Epoch 2141/10000, Training Loss: 0.0718940868973732, Test Loss: 0.06628129631280899\n", "Epoch 2142/10000, Training Loss: 0.07187148928642273, Test Loss: 0.06627285480499268\n", "Epoch 2143/10000, Training Loss: 0.07184889912605286, Test Loss: 0.06626442819833755\n", "Epoch 2144/10000, Training Loss: 0.07182637602090836, Test Loss: 0.06625602394342422\n", "Epoch 2145/10000, Training Loss: 0.07180385291576385, Test Loss: 0.06624764949083328\n", "Epoch 2146/10000, Training Loss: 0.07178135216236115, Test Loss: 0.06623925268650055\n", "Epoch 2147/10000, Training Loss: 0.07175889611244202, Test Loss: 0.0662308856844902\n", "Epoch 2148/10000, Training Loss: 0.07173644751310349, Test Loss: 
0.06622246652841568\n", "Epoch 2149/10000, Training Loss: 0.07171402871608734, Test Loss: 0.06621406227350235\n", "Epoch 2150/10000, Training Loss: 0.07169169187545776, Test Loss: 0.06620568037033081\n", "Epoch 2151/10000, Training Loss: 0.07166928797960281, Test Loss: 0.0661972165107727\n", "Epoch 2152/10000, Training Loss: 0.07164692878723145, Test Loss: 0.06618839502334595\n", "Epoch 2153/10000, Training Loss: 0.07162464410066605, Test Loss: 0.0661797747015953\n", "Epoch 2154/10000, Training Loss: 0.07160232216119766, Test Loss: 0.0661713182926178\n", "Epoch 2155/10000, Training Loss: 0.07158008217811584, Test Loss: 0.06616313755512238\n", "Epoch 2156/10000, Training Loss: 0.07155787944793701, Test Loss: 0.0661550834774971\n", "Epoch 2157/10000, Training Loss: 0.07153566181659698, Test Loss: 0.06614714860916138\n", "Epoch 2158/10000, Training Loss: 0.07151350378990173, Test Loss: 0.0661393478512764\n", "Epoch 2159/10000, Training Loss: 0.07149133086204529, Test Loss: 0.06613140553236008\n", "Epoch 2160/10000, Training Loss: 0.07146919518709183, Test Loss: 0.06612349301576614\n", "Epoch 2161/10000, Training Loss: 0.07144707441329956, Test Loss: 0.06611531227827072\n", "Epoch 2162/10000, Training Loss: 0.07142500579357147, Test Loss: 0.06610705703496933\n", "Epoch 2163/10000, Training Loss: 0.07140294462442398, Test Loss: 0.06609875708818436\n", "Epoch 2164/10000, Training Loss: 0.07138094305992126, Test Loss: 0.06609032303094864\n", "Epoch 2165/10000, Training Loss: 0.07135891169309616, Test Loss: 0.0660819560289383\n", "Epoch 2166/10000, Training Loss: 0.07133690267801285, Test Loss: 0.06607364863157272\n", "Epoch 2167/10000, Training Loss: 0.0713149756193161, Test Loss: 0.06606545299291611\n", "Epoch 2168/10000, Training Loss: 0.07129302620887756, Test Loss: 0.06605731695890427\n", "Epoch 2169/10000, Training Loss: 0.0712711364030838, Test Loss: 0.06604933738708496\n", "Epoch 2170/10000, Training Loss: 0.07124928385019302, Test Loss: 0.06604140251874924\n", "Epoch 2171/10000, Training Loss: 0.07122737169265747, Test Loss: 0.06603359431028366\n", "Epoch 2172/10000, Training Loss: 0.07120554894208908, Test Loss: 0.06602579355239868\n", "Epoch 2173/10000, Training Loss: 0.07118376344442368, Test Loss: 0.06601795554161072\n", "Epoch 2174/10000, Training Loss: 0.07116196304559708, Test Loss: 0.06601008772850037\n", "Epoch 2175/10000, Training Loss: 0.07114022225141525, Test Loss: 0.06600211560726166\n", "Epoch 2176/10000, Training Loss: 0.07111847400665283, Test Loss: 0.06599406898021698\n", "Epoch 2177/10000, Training Loss: 0.07109678536653519, Test Loss: 0.06598600000143051\n", "Epoch 2178/10000, Training Loss: 0.07107507437467575, Test Loss: 0.06597795337438583\n", "Epoch 2179/10000, Training Loss: 0.07105343788862228, Test Loss: 0.06596999615430832\n", "Epoch 2180/10000, Training Loss: 0.07103177905082703, Test Loss: 0.06596197932958603\n", "Epoch 2181/10000, Training Loss: 0.07101017236709595, Test Loss: 0.06595402210950851\n", "Epoch 2182/10000, Training Loss: 0.07098857313394547, Test Loss: 0.06594612449407578\n", "Epoch 2183/10000, Training Loss: 0.07096704095602036, Test Loss: 0.06593834608793259\n", "Epoch 2184/10000, Training Loss: 0.07094549387693405, Test Loss: 0.0659305602312088\n", "Epoch 2185/10000, Training Loss: 0.07092396169900894, Test Loss: 0.06592285633087158\n", "Epoch 2186/10000, Training Loss: 0.0709024965763092, Test Loss: 0.06591521948575974\n", "Epoch 2187/10000, Training Loss: 0.07088100165128708, Test Loss: 0.06590746343135834\n", "Epoch 2188/10000, Training 
Loss: 0.07085958868265152, Test Loss: 0.06589971482753754\n", "Epoch 2189/10000, Training Loss: 0.07083814591169357, Test Loss: 0.06589198112487793\n", "Epoch 2190/10000, Training Loss: 0.0708167552947998, Test Loss: 0.06588419526815414\n", "Epoch 2191/10000, Training Loss: 0.07079537212848663, Test Loss: 0.06587649881839752\n", "Epoch 2192/10000, Training Loss: 0.07077401876449585, Test Loss: 0.06586872041225433\n", "Epoch 2193/10000, Training Loss: 0.07075268775224686, Test Loss: 0.0658610537648201\n", "Epoch 2194/10000, Training Loss: 0.07073141634464264, Test Loss: 0.06585334986448288\n", "Epoch 2195/10000, Training Loss: 0.07071015238761902, Test Loss: 0.06584560126066208\n", "Epoch 2196/10000, Training Loss: 0.0706888884305954, Test Loss: 0.0658382698893547\n", "Epoch 2197/10000, Training Loss: 0.07066763192415237, Test Loss: 0.06583113968372345\n", "Epoch 2198/10000, Training Loss: 0.07064647227525711, Test Loss: 0.06582415848970413\n", "Epoch 2199/10000, Training Loss: 0.07062528282403946, Test Loss: 0.06581710278987885\n", "Epoch 2200/10000, Training Loss: 0.07060413062572479, Test Loss: 0.06581001728773117\n", "Epoch 2201/10000, Training Loss: 0.07058299332857132, Test Loss: 0.06580261141061783\n", "Epoch 2202/10000, Training Loss: 0.07056190073490143, Test Loss: 0.06579495966434479\n", "Epoch 2203/10000, Training Loss: 0.07054080069065094, Test Loss: 0.06578712910413742\n", "Epoch 2204/10000, Training Loss: 0.07051973044872284, Test Loss: 0.06577906757593155\n", "Epoch 2205/10000, Training Loss: 0.07049870491027832, Test Loss: 0.06577090173959732\n", "Epoch 2206/10000, Training Loss: 0.0704776793718338, Test Loss: 0.06576280295848846\n", "Epoch 2207/10000, Training Loss: 0.07045669108629227, Test Loss: 0.06575489789247513\n", "Epoch 2208/10000, Training Loss: 0.07043575495481491, Test Loss: 0.06574717164039612\n", "Epoch 2209/10000, Training Loss: 0.07041478902101517, Test Loss: 0.06573964655399323\n", "Epoch 2210/10000, Training Loss: 0.07039384543895721, Test Loss: 0.06573227047920227\n", "Epoch 2211/10000, Training Loss: 0.07037296146154404, Test Loss: 0.06572499126195908\n", "Epoch 2212/10000, Training Loss: 0.07035208493471146, Test Loss: 0.06571788340806961\n", "Epoch 2213/10000, Training Loss: 0.07033124566078186, Test Loss: 0.06571067124605179\n", "Epoch 2214/10000, Training Loss: 0.07031040638685226, Test Loss: 0.0657033696770668\n", "Epoch 2215/10000, Training Loss: 0.07028960436582565, Test Loss: 0.06569595634937286\n", "Epoch 2216/10000, Training Loss: 0.07026880979537964, Test Loss: 0.06568850576877594\n", "Epoch 2217/10000, Training Loss: 0.0702480673789978, Test Loss: 0.0656808614730835\n", "Epoch 2218/10000, Training Loss: 0.07022731006145477, Test Loss: 0.06567317992448807\n", "Epoch 2219/10000, Training Loss: 0.07020658254623413, Test Loss: 0.06566546857357025\n", "Epoch 2220/10000, Training Loss: 0.07018587738275528, Test Loss: 0.0656578466296196\n", "Epoch 2221/10000, Training Loss: 0.07016519457101822, Test Loss: 0.0656503289937973\n", "Epoch 2222/10000, Training Loss: 0.07014454901218414, Test Loss: 0.06564285606145859\n", "Epoch 2223/10000, Training Loss: 0.07012391835451126, Test Loss: 0.06563548743724823\n", "Epoch 2224/10000, Training Loss: 0.07010329514741898, Test Loss: 0.06562825292348862\n", "Epoch 2225/10000, Training Loss: 0.07008273154497147, Test Loss: 0.06562109291553497\n", "Epoch 2226/10000, Training Loss: 0.07006210088729858, Test Loss: 0.0656140148639679\n", "Epoch 2227/10000, Training Loss: 0.07004162669181824, Test Loss: 
0.06560683995485306\n", "Epoch 2228/10000, Training Loss: 0.07002108544111252, Test Loss: 0.06559969484806061\n", "Epoch 2229/10000, Training Loss: 0.07000059634447098, Test Loss: 0.06559241563081741\n", "Epoch 2230/10000, Training Loss: 0.06998012214899063, Test Loss: 0.06558509916067123\n", "Epoch 2231/10000, Training Loss: 0.06995964795351028, Test Loss: 0.06557773798704147\n", "Epoch 2232/10000, Training Loss: 0.06993919610977173, Test Loss: 0.06557038426399231\n", "Epoch 2233/10000, Training Loss: 0.06991880387067795, Test Loss: 0.06556306034326553\n", "Epoch 2234/10000, Training Loss: 0.06989840418100357, Test Loss: 0.06555574387311935\n", "Epoch 2235/10000, Training Loss: 0.06987802684307098, Test Loss: 0.06554844975471497\n", "Epoch 2236/10000, Training Loss: 0.06985766440629959, Test Loss: 0.06554137170314789\n", "Epoch 2237/10000, Training Loss: 0.06983736157417297, Test Loss: 0.06553422659635544\n", "Epoch 2238/10000, Training Loss: 0.06981702148914337, Test Loss: 0.06552720069885254\n", "Epoch 2239/10000, Training Loss: 0.06979681551456451, Test Loss: 0.06552016735076904\n", "Epoch 2240/10000, Training Loss: 0.06977652758359909, Test Loss: 0.06551311165094376\n", "Epoch 2241/10000, Training Loss: 0.06975627690553665, Test Loss: 0.06550616025924683\n", "Epoch 2242/10000, Training Loss: 0.0697360411286354, Test Loss: 0.06549907475709915\n", "Epoch 2243/10000, Training Loss: 0.06971584260463715, Test Loss: 0.06549200415611267\n", "Epoch 2244/10000, Training Loss: 0.06969568133354187, Test Loss: 0.0654849261045456\n", "Epoch 2245/10000, Training Loss: 0.06967547535896301, Test Loss: 0.06547790765762329\n", "Epoch 2246/10000, Training Loss: 0.0696553960442543, Test Loss: 0.06547084450721741\n", "Epoch 2247/10000, Training Loss: 0.06963522732257843, Test Loss: 0.0654638409614563\n", "Epoch 2248/10000, Training Loss: 0.06961517035961151, Test Loss: 0.06545685976743698\n", "Epoch 2249/10000, Training Loss: 0.06959506869316101, Test Loss: 0.06544986367225647\n", "Epoch 2250/10000, Training Loss: 0.06957504898309708, Test Loss: 0.06544294208288193\n", "Epoch 2251/10000, Training Loss: 0.06955499947071075, Test Loss: 0.06543595343828201\n", "Epoch 2252/10000, Training Loss: 0.06953500211238861, Test Loss: 0.06542905420064926\n", "Epoch 2253/10000, Training Loss: 0.06951500475406647, Test Loss: 0.0654221624135971\n", "Epoch 2254/10000, Training Loss: 0.0694950520992279, Test Loss: 0.06541527062654495\n", "Epoch 2255/10000, Training Loss: 0.06947509199380875, Test Loss: 0.0654083862900734\n", "Epoch 2256/10000, Training Loss: 0.06945516914129257, Test Loss: 0.0654015764594078\n", "Epoch 2257/10000, Training Loss: 0.06943529099225998, Test Loss: 0.065394826233387\n", "Epoch 2258/10000, Training Loss: 0.0694153755903244, Test Loss: 0.0653880164027214\n", "Epoch 2259/10000, Training Loss: 0.0693955197930336, Test Loss: 0.0653812363743782\n", "Epoch 2260/10000, Training Loss: 0.0693756639957428, Test Loss: 0.06537439674139023\n", "Epoch 2261/10000, Training Loss: 0.06935585290193558, Test Loss: 0.06536755710840225\n", "Epoch 2262/10000, Training Loss: 0.06933604925870895, Test Loss: 0.06536073982715607\n", "Epoch 2263/10000, Training Loss: 0.06931626796722412, Test Loss: 0.06535393744707108\n", "Epoch 2264/10000, Training Loss: 0.0692964717745781, Test Loss: 0.0653470978140831\n", "Epoch 2265/10000, Training Loss: 0.06927676498889923, Test Loss: 0.06534036993980408\n", "Epoch 2266/10000, Training Loss: 0.06925702095031738, Test Loss: 0.06533359736204147\n", "Epoch 2267/10000, Training Loss: 
0.06923731416463852, Test Loss: 0.06532686203718185\n", "Epoch 2268/10000, Training Loss: 0.06921765208244324, Test Loss: 0.06532025337219238\n", "Epoch 2269/10000, Training Loss: 0.06919798254966736, Test Loss: 0.06531355530023575\n", "Epoch 2270/10000, Training Loss: 0.06917832791805267, Test Loss: 0.06530685722827911\n", "Epoch 2271/10000, Training Loss: 0.06915871798992157, Test Loss: 0.0653003379702568\n", "Epoch 2272/10000, Training Loss: 0.06913909316062927, Test Loss: 0.06529378145933151\n", "Epoch 2273/10000, Training Loss: 0.06911952793598175, Test Loss: 0.06528723984956741\n", "Epoch 2274/10000, Training Loss: 0.06909997016191483, Test Loss: 0.06528059393167496\n", "Epoch 2275/10000, Training Loss: 0.0690804198384285, Test Loss: 0.06527391076087952\n", "Epoch 2276/10000, Training Loss: 0.06906090676784515, Test Loss: 0.06526731699705124\n", "Epoch 2277/10000, Training Loss: 0.06904138624668121, Test Loss: 0.06526060402393341\n", "Epoch 2278/10000, Training Loss: 0.06902185827493668, Test Loss: 0.06525403261184692\n", "Epoch 2279/10000, Training Loss: 0.0690024346113205, Test Loss: 0.06524751335382462\n", "Epoch 2280/10000, Training Loss: 0.06898298859596252, Test Loss: 0.0652410015463829\n", "Epoch 2281/10000, Training Loss: 0.06896352022886276, Test Loss: 0.06523447483778\n", "Epoch 2282/10000, Training Loss: 0.06894415616989136, Test Loss: 0.0652279183268547\n", "Epoch 2283/10000, Training Loss: 0.06892475485801697, Test Loss: 0.06522142142057419\n", "Epoch 2284/10000, Training Loss: 0.06890541315078735, Test Loss: 0.06521502882242203\n", "Epoch 2285/10000, Training Loss: 0.06888604164123535, Test Loss: 0.06520859152078629\n", "Epoch 2286/10000, Training Loss: 0.06886670738458633, Test Loss: 0.06520214676856995\n", "Epoch 2287/10000, Training Loss: 0.06884738802909851, Test Loss: 0.06519570201635361\n", "Epoch 2288/10000, Training Loss: 0.06882811337709427, Test Loss: 0.06518927961587906\n", "Epoch 2289/10000, Training Loss: 0.06880881637334824, Test Loss: 0.06518277525901794\n", "Epoch 2290/10000, Training Loss: 0.06878959387540817, Test Loss: 0.06517639756202698\n", "Epoch 2291/10000, Training Loss: 0.06877031177282333, Test Loss: 0.06517002731561661\n", "Epoch 2292/10000, Training Loss: 0.06875114142894745, Test Loss: 0.06516356766223907\n", "Epoch 2293/10000, Training Loss: 0.06873193383216858, Test Loss: 0.06515717506408691\n", "Epoch 2294/10000, Training Loss: 0.0687127411365509, Test Loss: 0.06515076011419296\n", "Epoch 2295/10000, Training Loss: 0.06869357824325562, Test Loss: 0.06514439731836319\n", "Epoch 2296/10000, Training Loss: 0.06867443025112152, Test Loss: 0.065138079226017\n", "Epoch 2297/10000, Training Loss: 0.0686553344130516, Test Loss: 0.0651317834854126\n", "Epoch 2298/10000, Training Loss: 0.06863624602556229, Test Loss: 0.06512553989887238\n", "Epoch 2299/10000, Training Loss: 0.06861713528633118, Test Loss: 0.06511928141117096\n", "Epoch 2300/10000, Training Loss: 0.06859809160232544, Test Loss: 0.06511294841766357\n", "Epoch 2301/10000, Training Loss: 0.0685790479183197, Test Loss: 0.06510663777589798\n", "Epoch 2302/10000, Training Loss: 0.06856000423431396, Test Loss: 0.06510034203529358\n", "Epoch 2303/10000, Training Loss: 0.0685410276055336, Test Loss: 0.06509407609701157\n", "Epoch 2304/10000, Training Loss: 0.06852199882268906, Test Loss: 0.06508785486221313\n", "Epoch 2305/10000, Training Loss: 0.06850302964448929, Test Loss: 0.0650816336274147\n", "Epoch 2306/10000, Training Loss: 0.06848408281803131, Test Loss: 0.06507539004087448\n", 
"Epoch 2307/10000, Training Loss: 0.06846518069505692, Test Loss: 0.06506924331188202\n", "Epoch 2308/10000, Training Loss: 0.06844624131917953, Test Loss: 0.06506305932998657\n", "Epoch 2309/10000, Training Loss: 0.06842736899852753, Test Loss: 0.06505687534809113\n", "Epoch 2310/10000, Training Loss: 0.06840845942497253, Test Loss: 0.06505069136619568\n", "Epoch 2311/10000, Training Loss: 0.06838962435722351, Test Loss: 0.06504455208778381\n", "Epoch 2312/10000, Training Loss: 0.06837078928947449, Test Loss: 0.06503838300704956\n", "Epoch 2313/10000, Training Loss: 0.06835192441940308, Test Loss: 0.06503219902515411\n", "Epoch 2314/10000, Training Loss: 0.06833311170339584, Test Loss: 0.06502607464790344\n", "Epoch 2315/10000, Training Loss: 0.06831429898738861, Test Loss: 0.06501993536949158\n", "Epoch 2316/10000, Training Loss: 0.06829556077718735, Test Loss: 0.0650138333439827\n", "Epoch 2317/10000, Training Loss: 0.06827680766582489, Test Loss: 0.06500768661499023\n", "Epoch 2318/10000, Training Loss: 0.06825806200504303, Test Loss: 0.06500169634819031\n", "Epoch 2319/10000, Training Loss: 0.06823934614658356, Test Loss: 0.06499563157558441\n", "Epoch 2320/10000, Training Loss: 0.06822066009044647, Test Loss: 0.06498955190181732\n", "Epoch 2321/10000, Training Loss: 0.06820198893547058, Test Loss: 0.06498357653617859\n", "Epoch 2322/10000, Training Loss: 0.06818333268165588, Test Loss: 0.06497752666473389\n", "Epoch 2323/10000, Training Loss: 0.06816469132900238, Test Loss: 0.0649714469909668\n", "Epoch 2324/10000, Training Loss: 0.06814606487751007, Test Loss: 0.06496548652648926\n", "Epoch 2325/10000, Training Loss: 0.06812742352485657, Test Loss: 0.06495945155620575\n", "Epoch 2326/10000, Training Loss: 0.06810885667800903, Test Loss: 0.06495331227779388\n", "Epoch 2327/10000, Training Loss: 0.0680902823805809, Test Loss: 0.06494728475809097\n", "Epoch 2328/10000, Training Loss: 0.06807172298431396, Test Loss: 0.06494128704071045\n", "Epoch 2329/10000, Training Loss: 0.06805317848920822, Test Loss: 0.0649353563785553\n", "Epoch 2330/10000, Training Loss: 0.06803462654352188, Test Loss: 0.06492946296930313\n", "Epoch 2331/10000, Training Loss: 0.06801614910364151, Test Loss: 0.06492356956005096\n", "Epoch 2332/10000, Training Loss: 0.06799769401550293, Test Loss: 0.06491771340370178\n", "Epoch 2333/10000, Training Loss: 0.06797918677330017, Test Loss: 0.0649118423461914\n", "Epoch 2334/10000, Training Loss: 0.06796073913574219, Test Loss: 0.06490597128868103\n", "Epoch 2335/10000, Training Loss: 0.06794233620166779, Test Loss: 0.06490007787942886\n", "Epoch 2336/10000, Training Loss: 0.06792391091585159, Test Loss: 0.06489413976669312\n", "Epoch 2337/10000, Training Loss: 0.0679054856300354, Test Loss: 0.06488823890686035\n", "Epoch 2338/10000, Training Loss: 0.06788710504770279, Test Loss: 0.06488233059644699\n", "Epoch 2339/10000, Training Loss: 0.06786876171827316, Test Loss: 0.06487636268138885\n", "Epoch 2340/10000, Training Loss: 0.06785042583942413, Test Loss: 0.06487056612968445\n", "Epoch 2341/10000, Training Loss: 0.0678320899605751, Test Loss: 0.06486470252275467\n", "Epoch 2342/10000, Training Loss: 0.06781376898288727, Test Loss: 0.06485892087221146\n", "Epoch 2343/10000, Training Loss: 0.06779544055461884, Test Loss: 0.06485313177108765\n", "Epoch 2344/10000, Training Loss: 0.06777720153331757, Test Loss: 0.06484735012054443\n", "Epoch 2345/10000, Training Loss: 0.06775891035795212, Test Loss: 0.06484159827232361\n", "Epoch 2346/10000, Training Loss: 
0.06774066388607025, Test Loss: 0.0648357942700386\n", "Epoch 2347/10000, Training Loss: 0.06772243231534958, Test Loss: 0.06483004987239838\n", "Epoch 2348/10000, Training Loss: 0.06770424544811249, Test Loss: 0.06482426822185516\n", "Epoch 2349/10000, Training Loss: 0.067686066031456, Test Loss: 0.06481852382421494\n", "Epoch 2350/10000, Training Loss: 0.06766786426305771, Test Loss: 0.0648128017783165\n", "Epoch 2351/10000, Training Loss: 0.06764969974756241, Test Loss: 0.06480704247951508\n", "Epoch 2352/10000, Training Loss: 0.0676315501332283, Test Loss: 0.06480131298303604\n", "Epoch 2353/10000, Training Loss: 0.06761341542005539, Test Loss: 0.06479556858539581\n", "Epoch 2354/10000, Training Loss: 0.06759534031152725, Test Loss: 0.06478981673717499\n", "Epoch 2355/10000, Training Loss: 0.06757725030183792, Test Loss: 0.0647842288017273\n", "Epoch 2356/10000, Training Loss: 0.06755916029214859, Test Loss: 0.06477856636047363\n", "Epoch 2357/10000, Training Loss: 0.06754108518362045, Test Loss: 0.06477298587560654\n", "Epoch 2358/10000, Training Loss: 0.0675230622291565, Test Loss: 0.06476724147796631\n", "Epoch 2359/10000, Training Loss: 0.06750502437353134, Test Loss: 0.06476157158613205\n", "Epoch 2360/10000, Training Loss: 0.06748700141906738, Test Loss: 0.06475593894720078\n", "Epoch 2361/10000, Training Loss: 0.0674690455198288, Test Loss: 0.06475027650594711\n", "Epoch 2362/10000, Training Loss: 0.06745103746652603, Test Loss: 0.06474467366933823\n", "Epoch 2363/10000, Training Loss: 0.06743308901786804, Test Loss: 0.06473901122808456\n", "Epoch 2364/10000, Training Loss: 0.06741509586572647, Test Loss: 0.06473341584205627\n", "Epoch 2365/10000, Training Loss: 0.06739718466997147, Test Loss: 0.06472792476415634\n", "Epoch 2366/10000, Training Loss: 0.06737927347421646, Test Loss: 0.06472240388393402\n", "Epoch 2367/10000, Training Loss: 0.06736141443252563, Test Loss: 0.06471684575080872\n", "Epoch 2368/10000, Training Loss: 0.06734354794025421, Test Loss: 0.06471136957406998\n", "Epoch 2369/10000, Training Loss: 0.06732562929391861, Test Loss: 0.06470569968223572\n", "Epoch 2370/10000, Training Loss: 0.06730781495571136, Test Loss: 0.06470006704330444\n", "Epoch 2371/10000, Training Loss: 0.06728995591402054, Test Loss: 0.06469450145959854\n", "Epoch 2372/10000, Training Loss: 0.06727214902639389, Test Loss: 0.0646890252828598\n", "Epoch 2373/10000, Training Loss: 0.06725437194108963, Test Loss: 0.06468353420495987\n", "Epoch 2374/10000, Training Loss: 0.06723657250404358, Test Loss: 0.06467805802822113\n", "Epoch 2375/10000, Training Loss: 0.06721878051757812, Test Loss: 0.06467262655496597\n", "Epoch 2376/10000, Training Loss: 0.06720102578401566, Test Loss: 0.06466715037822723\n", "Epoch 2377/10000, Training Loss: 0.06718332320451736, Test Loss: 0.06466178596019745\n", "Epoch 2378/10000, Training Loss: 0.06716558337211609, Test Loss: 0.06465625017881393\n", "Epoch 2379/10000, Training Loss: 0.0671478882431984, Test Loss: 0.06465078145265579\n", "Epoch 2380/10000, Training Loss: 0.0671301931142807, Test Loss: 0.06464533507823944\n", "Epoch 2381/10000, Training Loss: 0.0671125277876854, Test Loss: 0.06463982909917831\n", "Epoch 2382/10000, Training Loss: 0.06709486991167068, Test Loss: 0.06463434547185898\n", "Epoch 2383/10000, Training Loss: 0.06707722693681717, Test Loss: 0.06462904065847397\n", "Epoch 2384/10000, Training Loss: 0.06705960631370544, Test Loss: 0.06462368369102478\n", "Epoch 2385/10000, Training Loss: 0.0670420229434967, Test Loss: 0.0646183043718338\n", 
"Epoch 2386/10000, Training Loss: 0.06702437996864319, Test Loss: 0.06461286544799805\n", "Epoch 2387/10000, Training Loss: 0.06700681149959564, Test Loss: 0.06460750102996826\n", "Epoch 2388/10000, Training Loss: 0.06698929518461227, Test Loss: 0.06460213661193848\n", "Epoch 2389/10000, Training Loss: 0.06697170436382294, Test Loss: 0.0645967647433281\n", "Epoch 2390/10000, Training Loss: 0.06695418059825897, Test Loss: 0.0645914301276207\n", "Epoch 2391/10000, Training Loss: 0.06693664193153381, Test Loss: 0.06458614021539688\n", "Epoch 2392/10000, Training Loss: 0.06691917777061462, Test Loss: 0.06458081305027008\n", "Epoch 2393/10000, Training Loss: 0.06690166145563126, Test Loss: 0.0645754411816597\n", "Epoch 2394/10000, Training Loss: 0.06688421219587326, Test Loss: 0.06457005441188812\n", "Epoch 2395/10000, Training Loss: 0.06686676293611526, Test Loss: 0.0645647719502449\n", "Epoch 2396/10000, Training Loss: 0.06684934347867966, Test Loss: 0.06455949693918228\n", "Epoch 2397/10000, Training Loss: 0.06683187186717987, Test Loss: 0.06455420702695847\n", "Epoch 2398/10000, Training Loss: 0.06681444495916367, Test Loss: 0.06454894691705704\n", "Epoch 2399/10000, Training Loss: 0.06679708510637283, Test Loss: 0.06454373151063919\n", "Epoch 2400/10000, Training Loss: 0.06677969545125961, Test Loss: 0.06453853845596313\n", "Epoch 2401/10000, Training Loss: 0.06676238030195236, Test Loss: 0.06453336775302887\n", "Epoch 2402/10000, Training Loss: 0.06674497574567795, Test Loss: 0.06452807039022446\n", "Epoch 2403/10000, Training Loss: 0.0667276605963707, Test Loss: 0.06452282518148422\n", "Epoch 2404/10000, Training Loss: 0.06671028584241867, Test Loss: 0.06451763957738876\n", "Epoch 2405/10000, Training Loss: 0.0666930079460144, Test Loss: 0.06451234966516495\n", "Epoch 2406/10000, Training Loss: 0.06667573004961014, Test Loss: 0.06450720131397247\n", "Epoch 2407/10000, Training Loss: 0.06665845960378647, Test Loss: 0.06450188905000687\n", "Epoch 2408/10000, Training Loss: 0.0666411817073822, Test Loss: 0.06449670344591141\n", "Epoch 2409/10000, Training Loss: 0.06662394851446152, Test Loss: 0.06449152529239655\n", "Epoch 2410/10000, Training Loss: 0.06660674512386322, Test Loss: 0.06448635458946228\n", "Epoch 2411/10000, Training Loss: 0.06658951938152313, Test Loss: 0.06448128819465637\n", "Epoch 2412/10000, Training Loss: 0.06657231599092484, Test Loss: 0.06447619944810867\n", "Epoch 2413/10000, Training Loss: 0.06655514985322952, Test Loss: 0.06447108089923859\n", "Epoch 2414/10000, Training Loss: 0.06653796881437302, Test Loss: 0.06446591764688492\n", "Epoch 2415/10000, Training Loss: 0.06652083992958069, Test Loss: 0.06446076184511185\n", "Epoch 2416/10000, Training Loss: 0.06650367379188538, Test Loss: 0.06445556879043579\n", "Epoch 2417/10000, Training Loss: 0.06648653745651245, Test Loss: 0.06445052474737167\n", "Epoch 2418/10000, Training Loss: 0.0664694607257843, Test Loss: 0.06444544345140457\n", "Epoch 2419/10000, Training Loss: 0.06645232439041138, Test Loss: 0.06444036215543747\n", "Epoch 2420/10000, Training Loss: 0.06643524765968323, Test Loss: 0.06443525105714798\n", "Epoch 2421/10000, Training Loss: 0.06641818583011627, Test Loss: 0.06443013995885849\n", "Epoch 2422/10000, Training Loss: 0.06640113145112991, Test Loss: 0.06442510336637497\n", "Epoch 2423/10000, Training Loss: 0.06638409942388535, Test Loss: 0.06442005932331085\n", "Epoch 2424/10000, Training Loss: 0.06636705249547958, Test Loss: 0.06441511958837509\n", "Epoch 2425/10000, Training Loss: 0.066350057721138, 
Test Loss: 0.06441006809473038\n", "Epoch 2426/10000, Training Loss: 0.06633307784795761, Test Loss: 0.06440506130456924\n", "Epoch 2427/10000, Training Loss: 0.06631609052419662, Test Loss: 0.06440000981092453\n", "Epoch 2428/10000, Training Loss: 0.06629911810159683, Test Loss: 0.06439504027366638\n", "Epoch 2429/10000, Training Loss: 0.06628216058015823, Test Loss: 0.06439004838466644\n", "Epoch 2430/10000, Training Loss: 0.06626523286104202, Test Loss: 0.06438501924276352\n", "Epoch 2431/10000, Training Loss: 0.06624829769134521, Test Loss: 0.06438001245260239\n", "Epoch 2432/10000, Training Loss: 0.06623141467571259, Test Loss: 0.06437499821186066\n", "Epoch 2433/10000, Training Loss: 0.06621449440717697, Test Loss: 0.0643700584769249\n", "Epoch 2434/10000, Training Loss: 0.06619761139154434, Test Loss: 0.06436505168676376\n", "Epoch 2435/10000, Training Loss: 0.06618072092533112, Test Loss: 0.06436017900705338\n", "Epoch 2436/10000, Training Loss: 0.06616386771202087, Test Loss: 0.0643552765250206\n", "Epoch 2437/10000, Training Loss: 0.06614704430103302, Test Loss: 0.06435034424066544\n", "Epoch 2438/10000, Training Loss: 0.06613018363714218, Test Loss: 0.06434541195631027\n", "Epoch 2439/10000, Training Loss: 0.06611340492963791, Test Loss: 0.06434046477079391\n", "Epoch 2440/10000, Training Loss: 0.06609663367271423, Test Loss: 0.06433558464050293\n", "Epoch 2441/10000, Training Loss: 0.06607982516288757, Test Loss: 0.06433070451021194\n", "Epoch 2442/10000, Training Loss: 0.0660630613565445, Test Loss: 0.06432575732469559\n", "Epoch 2443/10000, Training Loss: 0.06604629009962082, Test Loss: 0.06432083249092102\n", "Epoch 2444/10000, Training Loss: 0.06602957099676132, Test Loss: 0.06431600451469421\n", "Epoch 2445/10000, Training Loss: 0.06601283699274063, Test Loss: 0.06431113183498383\n", "Epoch 2446/10000, Training Loss: 0.06599610298871994, Test Loss: 0.0643063634634018\n", "Epoch 2447/10000, Training Loss: 0.06597943603992462, Test Loss: 0.06430153548717499\n", "Epoch 2448/10000, Training Loss: 0.06596271693706512, Test Loss: 0.0642966777086258\n", "Epoch 2449/10000, Training Loss: 0.0659460499882698, Test Loss: 0.06429183483123779\n", "Epoch 2450/10000, Training Loss: 0.06592941284179688, Test Loss: 0.06428693979978561\n", "Epoch 2451/10000, Training Loss: 0.06591273844242096, Test Loss: 0.06428218632936478\n", "Epoch 2452/10000, Training Loss: 0.06589610874652863, Test Loss: 0.06427732110023499\n", "Epoch 2453/10000, Training Loss: 0.06587949395179749, Test Loss: 0.06427253037691116\n", "Epoch 2454/10000, Training Loss: 0.06586290150880814, Test Loss: 0.06426778435707092\n", "Epoch 2455/10000, Training Loss: 0.0658462792634964, Test Loss: 0.06426301598548889\n", "Epoch 2456/10000, Training Loss: 0.06582969427108765, Test Loss: 0.06425823271274567\n", "Epoch 2457/10000, Training Loss: 0.06581313908100128, Test Loss: 0.06425347179174423\n", "Epoch 2458/10000, Training Loss: 0.06579659134149551, Test Loss: 0.06424866616725922\n", "Epoch 2459/10000, Training Loss: 0.06578005105257034, Test Loss: 0.06424390524625778\n", "Epoch 2460/10000, Training Loss: 0.06576353311538696, Test Loss: 0.06423921138048172\n", "Epoch 2461/10000, Training Loss: 0.06574700772762299, Test Loss: 0.06423445791006088\n", "Epoch 2462/10000, Training Loss: 0.06573052704334259, Test Loss: 0.06422977149486542\n", "Epoch 2463/10000, Training Loss: 0.06571401655673981, Test Loss: 0.06422524899244308\n", "Epoch 2464/10000, Training Loss: 0.0656975582242012, Test Loss: 0.06422089785337448\n", "Epoch 
2465/10000, Training Loss: 0.065681092441082, Test Loss: 0.0642165020108223\n", "Epoch 2466/10000, Training Loss: 0.06566467136144638, Test Loss: 0.0642121285200119\n", "Epoch 2467/10000, Training Loss: 0.06564822793006897, Test Loss: 0.06420768052339554\n", "Epoch 2468/10000, Training Loss: 0.06563179939985275, Test Loss: 0.06420300900936127\n", "Epoch 2469/10000, Training Loss: 0.06561542302370071, Test Loss: 0.06419822573661804\n", "Epoch 2470/10000, Training Loss: 0.0655990019440651, Test Loss: 0.06419339776039124\n", "Epoch 2471/10000, Training Loss: 0.06558264046907425, Test Loss: 0.0641884133219719\n", "Epoch 2472/10000, Training Loss: 0.06556627154350281, Test Loss: 0.06418345868587494\n", "Epoch 2473/10000, Training Loss: 0.06554991751909256, Test Loss: 0.06417861580848694\n", "Epoch 2474/10000, Training Loss: 0.06553356349468231, Test Loss: 0.06417382508516312\n", "Epoch 2475/10000, Training Loss: 0.06551726907491684, Test Loss: 0.06416921317577362\n", "Epoch 2476/10000, Training Loss: 0.06550091505050659, Test Loss: 0.06416457146406174\n", "Epoch 2477/10000, Training Loss: 0.06548462808132172, Test Loss: 0.06416011601686478\n", "Epoch 2478/10000, Training Loss: 0.06546835601329803, Test Loss: 0.06415564566850662\n", "Epoch 2479/10000, Training Loss: 0.06545207649469376, Test Loss: 0.06415120512247086\n", "Epoch 2480/10000, Training Loss: 0.06543584913015366, Test Loss: 0.06414663046598434\n", "Epoch 2481/10000, Training Loss: 0.06541956961154938, Test Loss: 0.06414210796356201\n", "Epoch 2482/10000, Training Loss: 0.06540331989526749, Test Loss: 0.06413747370243073\n", "Epoch 2483/10000, Training Loss: 0.06538709253072739, Test Loss: 0.06413274258375168\n", "Epoch 2484/10000, Training Loss: 0.06537090986967087, Test Loss: 0.06412810832262039\n", "Epoch 2485/10000, Training Loss: 0.06535470485687256, Test Loss: 0.06412338465452194\n", "Epoch 2486/10000, Training Loss: 0.06533850729465485, Test Loss: 0.06411871314048767\n", "Epoch 2487/10000, Training Loss: 0.06532235443592072, Test Loss: 0.06411415338516235\n", "Epoch 2488/10000, Training Loss: 0.06530621647834778, Test Loss: 0.06410954147577286\n", "Epoch 2489/10000, Training Loss: 0.06529003381729126, Test Loss: 0.06410504877567291\n", "Epoch 2490/10000, Training Loss: 0.06527390331029892, Test Loss: 0.06410050392150879\n", "Epoch 2491/10000, Training Loss: 0.06525778770446777, Test Loss: 0.06409607082605362\n", "Epoch 2492/10000, Training Loss: 0.06524169445037842, Test Loss: 0.06409161537885666\n", "Epoch 2493/10000, Training Loss: 0.06522558629512787, Test Loss: 0.06408717483282089\n", "Epoch 2494/10000, Training Loss: 0.0652095228433609, Test Loss: 0.06408259272575378\n", "Epoch 2495/10000, Training Loss: 0.06519340723752975, Test Loss: 0.06407807022333145\n", "Epoch 2496/10000, Training Loss: 0.06517732888460159, Test Loss: 0.0640735849738121\n", "Epoch 2497/10000, Training Loss: 0.06516129523515701, Test Loss: 0.06406906247138977\n", "Epoch 2498/10000, Training Loss: 0.06514527648687363, Test Loss: 0.06406448781490326\n", "Epoch 2499/10000, Training Loss: 0.06512929499149323, Test Loss: 0.0640600249171257\n", "Epoch 2500/10000, Training Loss: 0.06511326134204865, Test Loss: 0.06405557692050934\n", "Epoch 2501/10000, Training Loss: 0.06509724259376526, Test Loss: 0.06405112147331238\n", "Epoch 2502/10000, Training Loss: 0.06508123874664307, Test Loss: 0.06404668837785721\n", "Epoch 2503/10000, Training Loss: 0.06506527960300446, Test Loss: 0.06404224783182144\n", "Epoch 2504/10000, Training Loss: 0.06504930555820465, Test 
Loss: 0.06403782963752747\n", "Epoch 2505/10000, Training Loss: 0.06503335386514664, Test Loss: 0.0640334039926529\n", "Epoch 2506/10000, Training Loss: 0.0650174543261528, Test Loss: 0.06402897834777832\n", "Epoch 2507/10000, Training Loss: 0.0650014728307724, Test Loss: 0.06402459740638733\n", "Epoch 2508/10000, Training Loss: 0.06498559564352036, Test Loss: 0.064020074903965\n", "Epoch 2509/10000, Training Loss: 0.06496969610452652, Test Loss: 0.06401528418064117\n", "Epoch 2510/10000, Training Loss: 0.06495379656553268, Test Loss: 0.06401047855615616\n", "Epoch 2511/10000, Training Loss: 0.06493792682886124, Test Loss: 0.06400594115257263\n", "Epoch 2512/10000, Training Loss: 0.06492206454277039, Test Loss: 0.06400170922279358\n", "Epoch 2513/10000, Training Loss: 0.06490623205900192, Test Loss: 0.06399770081043243\n", "Epoch 2514/10000, Training Loss: 0.06489038467407227, Test Loss: 0.06399384886026382\n", "Epoch 2515/10000, Training Loss: 0.06487453728914261, Test Loss: 0.06399000436067581\n", "Epoch 2516/10000, Training Loss: 0.06485871970653534, Test Loss: 0.06398604810237885\n", "Epoch 2517/10000, Training Loss: 0.06484293937683105, Test Loss: 0.06398189067840576\n", "Epoch 2518/10000, Training Loss: 0.06482715159654617, Test Loss: 0.06397750228643417\n", "Epoch 2519/10000, Training Loss: 0.06481136381626129, Test Loss: 0.06397295743227005\n", "Epoch 2520/10000, Training Loss: 0.064795583486557, Test Loss: 0.06396830081939697\n", "Epoch 2521/10000, Training Loss: 0.06477981060743332, Test Loss: 0.06396365910768509\n", "Epoch 2522/10000, Training Loss: 0.06476406753063202, Test Loss: 0.06395914405584335\n", "Epoch 2523/10000, Training Loss: 0.06474833190441132, Test Loss: 0.06395465135574341\n", "Epoch 2524/10000, Training Loss: 0.06473260372877121, Test Loss: 0.06395027786493301\n", "Epoch 2525/10000, Training Loss: 0.06471691280603409, Test Loss: 0.0639462098479271\n", "Epoch 2526/10000, Training Loss: 0.06470116972923279, Test Loss: 0.06394201517105103\n", "Epoch 2527/10000, Training Loss: 0.06468549370765686, Test Loss: 0.06393768638372421\n", "Epoch 2528/10000, Training Loss: 0.06466984748840332, Test Loss: 0.06393348425626755\n", "Epoch 2529/10000, Training Loss: 0.06465420126914978, Test Loss: 0.06392946094274521\n", "Epoch 2530/10000, Training Loss: 0.06463853269815445, Test Loss: 0.06392540037631989\n", "Epoch 2531/10000, Training Loss: 0.06462286412715912, Test Loss: 0.06392132490873337\n", "Epoch 2532/10000, Training Loss: 0.06460727006196976, Test Loss: 0.06391730904579163\n", "Epoch 2533/10000, Training Loss: 0.06459163874387741, Test Loss: 0.06391323357820511\n", "Epoch 2534/10000, Training Loss: 0.06457603722810745, Test Loss: 0.0639091357588768\n", "Epoch 2535/10000, Training Loss: 0.06456045061349869, Test Loss: 0.0639047771692276\n", "Epoch 2536/10000, Training Loss: 0.06454482674598694, Test Loss: 0.06390047073364258\n", "Epoch 2537/10000, Training Loss: 0.06452930718660355, Test Loss: 0.0638960599899292\n", "Epoch 2538/10000, Training Loss: 0.06451372057199478, Test Loss: 0.06389173865318298\n", "Epoch 2539/10000, Training Loss: 0.0644981861114502, Test Loss: 0.06388752907514572\n", "Epoch 2540/10000, Training Loss: 0.06448263674974442, Test Loss: 0.06388332694768906\n", "Epoch 2541/10000, Training Loss: 0.06446708738803864, Test Loss: 0.06387932598590851\n", "Epoch 2542/10000, Training Loss: 0.06445160508155823, Test Loss: 0.06387520581483841\n", "Epoch 2543/10000, Training Loss: 0.06443610787391663, Test Loss: 0.06387116760015488\n", "Epoch 2544/10000, 
Training Loss: 0.06442061066627502, Test Loss: 0.0638672411441803\n", "... [per-epoch output for epochs 2545-3369 elided; training loss decreases steadily from 0.0644 to 0.0540 and test loss from 0.0639 to 0.0620] ...\n", "Epoch 3370/10000, Training Loss: 0.05403939634561539,
Test Loss: 0.06204664707183838\n", "Epoch 3371/10000, Training Loss: 0.05402863025665283, Test Loss: 0.06204581633210182\n", "Epoch 3372/10000, Training Loss: 0.05401787906885147, Test Loss: 0.06204502284526825\n", "Epoch 3373/10000, Training Loss: 0.054007116705179214, Test Loss: 0.062044352293014526\n", "Epoch 3374/10000, Training Loss: 0.053996335715055466, Test Loss: 0.06204371154308319\n", "Epoch 3375/10000, Training Loss: 0.053985584527254105, Test Loss: 0.062042854726314545\n", "Epoch 3376/10000, Training Loss: 0.05397482588887215, Test Loss: 0.06204182282090187\n", "Epoch 3377/10000, Training Loss: 0.05396406725049019, Test Loss: 0.06204097718000412\n", "Epoch 3378/10000, Training Loss: 0.053953323513269424, Test Loss: 0.06204048916697502\n", "Epoch 3379/10000, Training Loss: 0.053942590951919556, Test Loss: 0.0620403029024601\n", "Epoch 3380/10000, Training Loss: 0.05393185839056969, Test Loss: 0.0620402954518795\n", "Epoch 3381/10000, Training Loss: 0.05392112210392952, Test Loss: 0.062040138989686966\n", "Epoch 3382/10000, Training Loss: 0.05391038954257965, Test Loss: 0.06203923374414444\n", "Epoch 3383/10000, Training Loss: 0.05389963090419769, Test Loss: 0.06203782930970192\n", "Epoch 3384/10000, Training Loss: 0.05388887971639633, Test Loss: 0.06203638017177582\n", "Epoch 3385/10000, Training Loss: 0.05387812852859497, Test Loss: 0.06203557923436165\n", "Epoch 3386/10000, Training Loss: 0.053867436945438385, Test Loss: 0.062035415321588516\n", "Epoch 3387/10000, Training Loss: 0.05385671555995941, Test Loss: 0.06203553453087807\n", "Epoch 3388/10000, Training Loss: 0.05384599044919014, Test Loss: 0.06203574314713478\n", "Epoch 3389/10000, Training Loss: 0.05383528396487236, Test Loss: 0.06203578785061836\n", "Epoch 3390/10000, Training Loss: 0.05382455885410309, Test Loss: 0.062035128474235535\n", "Epoch 3391/10000, Training Loss: 0.05381381884217262, Test Loss: 0.062033798545598984\n", "Epoch 3392/10000, Training Loss: 0.053803130984306335, Test Loss: 0.062032103538513184\n", "Epoch 3393/10000, Training Loss: 0.05379238724708557, Test Loss: 0.062030963599681854\n", "Epoch 3394/10000, Training Loss: 0.05378169193863869, Test Loss: 0.06203044205904007\n", "Epoch 3395/10000, Training Loss: 0.053771018981933594, Test Loss: 0.0620303712785244\n", "Epoch 3396/10000, Training Loss: 0.05376030504703522, Test Loss: 0.06203049421310425\n", "Epoch 3397/10000, Training Loss: 0.05374959483742714, Test Loss: 0.06203025206923485\n", "Epoch 3398/10000, Training Loss: 0.05373891070485115, Test Loss: 0.06202998384833336\n", "Epoch 3399/10000, Training Loss: 0.05372823774814606, Test Loss: 0.06202954053878784\n", "Epoch 3400/10000, Training Loss: 0.05371749773621559, Test Loss: 0.062028974294662476\n", "Epoch 3401/10000, Training Loss: 0.05370679125189781, Test Loss: 0.062028076499700546\n", "Epoch 3402/10000, Training Loss: 0.0536961629986763, Test Loss: 0.0620267279446125\n", "Epoch 3403/10000, Training Loss: 0.05368543788790703, Test Loss: 0.06202548369765282\n", "Epoch 3404/10000, Training Loss: 0.05367477983236313, Test Loss: 0.06202463060617447\n", "Epoch 3405/10000, Training Loss: 0.05366409197449684, Test Loss: 0.06202445924282074\n", "Epoch 3406/10000, Training Loss: 0.053653448820114136, Test Loss: 0.06202477589249611\n", "Epoch 3407/10000, Training Loss: 0.053642768412828445, Test Loss: 0.06202521547675133\n", "Epoch 3408/10000, Training Loss: 0.05363212525844574, Test Loss: 0.062025364488363266\n", "Epoch 3409/10000, Training Loss: 0.053621433675289154, Test Loss: 
0.0620248019695282\n", "Epoch 3410/10000, Training Loss: 0.05361075699329376, Test Loss: 0.06202348694205284\n", "Epoch 3411/10000, Training Loss: 0.053600095212459564, Test Loss: 0.06202193722128868\n", "Epoch 3412/10000, Training Loss: 0.053589459508657455, Test Loss: 0.062020834535360336\n", "Epoch 3413/10000, Training Loss: 0.053578782826662064, Test Loss: 0.06202027574181557\n", "Epoch 3414/10000, Training Loss: 0.053568121045827866, Test Loss: 0.06202038377523422\n", "Epoch 3415/10000, Training Loss: 0.05355745926499367, Test Loss: 0.062020715326070786\n", "Epoch 3416/10000, Training Loss: 0.05354681611061096, Test Loss: 0.06202095001935959\n", "Epoch 3417/10000, Training Loss: 0.05353614315390587, Test Loss: 0.062020543962717056\n", "Epoch 3418/10000, Training Loss: 0.05352551117539406, Test Loss: 0.06201957166194916\n", "Epoch 3419/10000, Training Loss: 0.053514882922172546, Test Loss: 0.06201821565628052\n", "Epoch 3420/10000, Training Loss: 0.05350426957011223, Test Loss: 0.062016863375902176\n", "Epoch 3421/10000, Training Loss: 0.053493596613407135, Test Loss: 0.06201602891087532\n", "Epoch 3422/10000, Training Loss: 0.05348297208547592, Test Loss: 0.06201589107513428\n", "Epoch 3423/10000, Training Loss: 0.053472358733415604, Test Loss: 0.062016431242227554\n", "Epoch 3424/10000, Training Loss: 0.0534617118537426, Test Loss: 0.06201697513461113\n", "Epoch 3425/10000, Training Loss: 0.05345109850168228, Test Loss: 0.06201726943254471\n", "Epoch 3426/10000, Training Loss: 0.05344047397375107, Test Loss: 0.062016818672418594\n", "Epoch 3427/10000, Training Loss: 0.05342983081936836, Test Loss: 0.0620155856013298\n", "Epoch 3428/10000, Training Loss: 0.053419239819049835, Test Loss: 0.06201393902301788\n", "Epoch 3429/10000, Training Loss: 0.053408604115247726, Test Loss: 0.0620124489068985\n", "Epoch 3430/10000, Training Loss: 0.05339797958731651, Test Loss: 0.0620114840567112\n", "Epoch 3431/10000, Training Loss: 0.053387392312288284, Test Loss: 0.062011368572711945\n", "Epoch 3432/10000, Training Loss: 0.053376782685518265, Test Loss: 0.06201193481683731\n", "Epoch 3433/10000, Training Loss: 0.053366146981716156, Test Loss: 0.06201276183128357\n", "Epoch 3434/10000, Training Loss: 0.05335557460784912, Test Loss: 0.062013331800699234\n", "Epoch 3435/10000, Training Loss: 0.05334495007991791, Test Loss: 0.06201330944895744\n", "Epoch 3436/10000, Training Loss: 0.05333433672785759, Test Loss: 0.06201222911477089\n", "Epoch 3437/10000, Training Loss: 0.05332374945282936, Test Loss: 0.06201053038239479\n", "Epoch 3438/10000, Training Loss: 0.05331316217780113, Test Loss: 0.06200869381427765\n", "Epoch 3439/10000, Training Loss: 0.0533025786280632, Test Loss: 0.062007445842027664\n", "Epoch 3440/10000, Training Loss: 0.05329195410013199, Test Loss: 0.06200728565454483\n", "Epoch 3441/10000, Training Loss: 0.05328136309981346, Test Loss: 0.06200789660215378\n", "Epoch 3442/10000, Training Loss: 0.05327078327536583, Test Loss: 0.06200893968343735\n", "Epoch 3443/10000, Training Loss: 0.053260210901498795, Test Loss: 0.062009405344724655\n", "Epoch 3444/10000, Training Loss: 0.05324963107705116, Test Loss: 0.06200896576046944\n", "Epoch 3445/10000, Training Loss: 0.05323905497789383, Test Loss: 0.06200791522860527\n", "Epoch 3446/10000, Training Loss: 0.05322849377989769, Test Loss: 0.062006689608097076\n", "Epoch 3447/10000, Training Loss: 0.05321790650486946, Test Loss: 0.062005627900362015\n", "Epoch 3448/10000, Training Loss: 0.053207334131002426, Test Loss: 0.06200505048036575\n", 
"Epoch 3449/10000, Training Loss: 0.05319676548242569, Test Loss: 0.06200486421585083\n", "Epoch 3450/10000, Training Loss: 0.05318622291088104, Test Loss: 0.06200487166643143\n", "Epoch 3451/10000, Training Loss: 0.053175654262304306, Test Loss: 0.06200483441352844\n", "Epoch 3452/10000, Training Loss: 0.053165093064308167, Test Loss: 0.06200466305017471\n", "Epoch 3453/10000, Training Loss: 0.05315449833869934, Test Loss: 0.06200455501675606\n", "Epoch 3454/10000, Training Loss: 0.05314395949244499, Test Loss: 0.06200430542230606\n", "Epoch 3455/10000, Training Loss: 0.05313344672322273, Test Loss: 0.06200404465198517\n", "Epoch 3456/10000, Training Loss: 0.053122904151678085, Test Loss: 0.06200363114476204\n", "Epoch 3457/10000, Training Loss: 0.05311233177781105, Test Loss: 0.06200283020734787\n", "Epoch 3458/10000, Training Loss: 0.05310177057981491, Test Loss: 0.0620017871260643\n", "Epoch 3459/10000, Training Loss: 0.05309121683239937, Test Loss: 0.062000785022974014\n", "Epoch 3460/10000, Training Loss: 0.053080689162015915, Test Loss: 0.06200047954916954\n", "Epoch 3461/10000, Training Loss: 0.05307011678814888, Test Loss: 0.06200075149536133\n", "Epoch 3462/10000, Training Loss: 0.05305960401892662, Test Loss: 0.06200139969587326\n", "Epoch 3463/10000, Training Loss: 0.05304912105202675, Test Loss: 0.06200186535716057\n", "Epoch 3464/10000, Training Loss: 0.05303855612874031, Test Loss: 0.06200159713625908\n", "Epoch 3465/10000, Training Loss: 0.05302805081009865, Test Loss: 0.062000591307878494\n", "Epoch 3466/10000, Training Loss: 0.053017519414424896, Test Loss: 0.0619993582367897\n", "Epoch 3467/10000, Training Loss: 0.053006965667009354, Test Loss: 0.061998400837183\n", "Epoch 3468/10000, Training Loss: 0.05299646779894829, Test Loss: 0.061997879296541214\n", "Epoch 3469/10000, Training Loss: 0.052985940128564835, Test Loss: 0.06199803203344345\n", "Epoch 3470/10000, Training Loss: 0.05297542363405228, Test Loss: 0.06199852004647255\n", "Epoch 3471/10000, Training Loss: 0.05296490341424942, Test Loss: 0.06199869140982628\n", "Epoch 3472/10000, Training Loss: 0.052954401820898056, Test Loss: 0.06199849396944046\n", "Epoch 3473/10000, Training Loss: 0.052943866699934006, Test Loss: 0.0619979090988636\n", "Epoch 3474/10000, Training Loss: 0.05293339490890503, Test Loss: 0.06199733167886734\n", "Epoch 3475/10000, Training Loss: 0.05292288213968277, Test Loss: 0.06199679523706436\n", "Epoch 3476/10000, Training Loss: 0.0529123954474926, Test Loss: 0.06199651211500168\n", "Epoch 3477/10000, Training Loss: 0.05290185287594795, Test Loss: 0.06199631467461586\n", "Epoch 3478/10000, Training Loss: 0.05289137363433838, Test Loss: 0.061996039003133774\n", "Epoch 3479/10000, Training Loss: 0.052880898118019104, Test Loss: 0.06199559569358826\n", "Epoch 3480/10000, Training Loss: 0.05287039279937744, Test Loss: 0.06199503690004349\n", "Epoch 3481/10000, Training Loss: 0.05285988748073578, Test Loss: 0.06199480593204498\n", "Epoch 3482/10000, Training Loss: 0.05284938961267471, Test Loss: 0.06199482083320618\n", "Epoch 3483/10000, Training Loss: 0.052838921546936035, Test Loss: 0.06199508532881737\n", "Epoch 3484/10000, Training Loss: 0.05282843858003616, Test Loss: 0.061994992196559906\n", "Epoch 3485/10000, Training Loss: 0.05281795933842659, Test Loss: 0.06199447065591812\n", "Epoch 3486/10000, Training Loss: 0.052807461470365524, Test Loss: 0.061994004994630814\n", "Epoch 3487/10000, Training Loss: 0.05279702693223953, Test Loss: 0.06199362501502037\n", "Epoch 3488/10000, Training 
Loss: 0.05278655141592026, Test Loss: 0.06199350953102112\n", "Epoch 3489/10000, Training Loss: 0.052776068449020386, Test Loss: 0.06199317425489426\n", "Epoch 3490/10000, Training Loss: 0.05276559665799141, Test Loss: 0.06199265271425247\n", "Epoch 3491/10000, Training Loss: 0.052755098789930344, Test Loss: 0.06199199706315994\n", "Epoch 3492/10000, Training Loss: 0.05274464935064316, Test Loss: 0.06199193373322487\n", "Epoch 3493/10000, Training Loss: 0.052734192460775375, Test Loss: 0.06199213117361069\n", "Epoch 3494/10000, Training Loss: 0.05272374674677849, Test Loss: 0.06199252977967262\n", "Epoch 3495/10000, Training Loss: 0.052713263779878616, Test Loss: 0.06199279800057411\n", "Epoch 3496/10000, Training Loss: 0.052702806890010834, Test Loss: 0.06199246644973755\n", "Epoch 3497/10000, Training Loss: 0.05269235745072365, Test Loss: 0.06199165806174278\n", "Epoch 3498/10000, Training Loss: 0.05268194153904915, Test Loss: 0.06199098005890846\n", "Epoch 3499/10000, Training Loss: 0.052671462297439575, Test Loss: 0.06199028715491295\n", "Epoch 3500/10000, Training Loss: 0.05266103893518448, Test Loss: 0.061990052461624146\n", "Epoch 3501/10000, Training Loss: 0.05265054106712341, Test Loss: 0.06199011579155922\n", "Epoch 3502/10000, Training Loss: 0.0526401549577713, Test Loss: 0.0619901642203331\n", "Epoch 3503/10000, Training Loss: 0.05262967199087143, Test Loss: 0.06199019029736519\n", "Epoch 3504/10000, Training Loss: 0.052619241178035736, Test Loss: 0.06199030578136444\n", "Epoch 3505/10000, Training Loss: 0.05260884389281273, Test Loss: 0.061990272253751755\n", "Epoch 3506/10000, Training Loss: 0.05259837955236435, Test Loss: 0.06199022755026817\n", "Epoch 3507/10000, Training Loss: 0.05258796736598015, Test Loss: 0.06198979169130325\n", "Epoch 3508/10000, Training Loss: 0.05257750675082207, Test Loss: 0.061989083886146545\n", "Epoch 3509/10000, Training Loss: 0.05256708711385727, Test Loss: 0.06198831647634506\n", "Epoch 3510/10000, Training Loss: 0.05255668982863426, Test Loss: 0.06198814511299133\n", "Epoch 3511/10000, Training Loss: 0.05254626274108887, Test Loss: 0.06198831647634506\n", "Epoch 3512/10000, Training Loss: 0.05253583565354347, Test Loss: 0.061988651752471924\n", "Epoch 3513/10000, Training Loss: 0.05252540856599808, Test Loss: 0.06198914721608162\n", "Epoch 3514/10000, Training Loss: 0.05251501128077507, Test Loss: 0.06198908016085625\n", "Epoch 3515/10000, Training Loss: 0.05250459536910057, Test Loss: 0.0619884729385376\n", "Epoch 3516/10000, Training Loss: 0.05249420925974846, Test Loss: 0.061987679451704025\n", "Epoch 3517/10000, Training Loss: 0.05248380824923515, Test Loss: 0.06198717653751373\n", "Epoch 3518/10000, Training Loss: 0.05247338488698006, Test Loss: 0.06198710575699806\n", "Epoch 3519/10000, Training Loss: 0.05246298015117645, Test Loss: 0.06198735535144806\n", "Epoch 3520/10000, Training Loss: 0.05245256796479225, Test Loss: 0.061987824738025665\n", "Epoch 3521/10000, Training Loss: 0.052442170679569244, Test Loss: 0.06198788806796074\n", "Epoch 3522/10000, Training Loss: 0.05243179574608803, Test Loss: 0.06198759749531746\n", "Epoch 3523/10000, Training Loss: 0.05242140218615532, Test Loss: 0.06198688969016075\n", "Epoch 3524/10000, Training Loss: 0.05241100862622261, Test Loss: 0.06198606640100479\n", "Epoch 3525/10000, Training Loss: 0.052400629967451096, Test Loss: 0.061985746026039124\n", "Epoch 3526/10000, Training Loss: 0.05239022150635719, Test Loss: 0.061986058950424194\n", "Epoch 3527/10000, Training Loss: 0.052379850298166275, 
Test Loss: 0.06198672577738762\n", "Epoch 3528/10000, Training Loss: 0.052369456738233566, Test Loss: 0.06198735535144806\n", "Epoch 3529/10000, Training Loss: 0.05235906317830086, Test Loss: 0.061987634748220444\n", "Epoch 3530/10000, Training Loss: 0.05234871059656143, Test Loss: 0.06198717653751373\n", "Epoch 3531/10000, Training Loss: 0.052338335663080215, Test Loss: 0.06198625639081001\n", "Epoch 3532/10000, Training Loss: 0.0523279644548893, Test Loss: 0.061985258013010025\n", "Epoch 3533/10000, Training Loss: 0.05231756716966629, Test Loss: 0.06198446452617645\n", "Epoch 3534/10000, Training Loss: 0.052307188510894775, Test Loss: 0.061984412372112274\n", "Epoch 3535/10000, Training Loss: 0.052296847105026245, Test Loss: 0.06198509782552719\n", "Epoch 3536/10000, Training Loss: 0.05228649079799652, Test Loss: 0.06198615953326225\n", "Epoch 3537/10000, Training Loss: 0.0522761233150959, Test Loss: 0.06198694929480553\n", "Epoch 3538/10000, Training Loss: 0.052265752106904984, Test Loss: 0.06198691576719284\n", "Epoch 3539/10000, Training Loss: 0.052255403250455856, Test Loss: 0.06198619306087494\n", "Epoch 3540/10000, Training Loss: 0.052245039492845535, Test Loss: 0.06198509782552719\n", "Epoch 3541/10000, Training Loss: 0.0522347092628479, Test Loss: 0.06198432296514511\n", "Epoch 3542/10000, Training Loss: 0.05222431570291519, Test Loss: 0.06198420748114586\n", "Epoch 3543/10000, Training Loss: 0.05221397802233696, Test Loss: 0.06198464706540108\n", "Epoch 3544/10000, Training Loss: 0.05220362916588783, Test Loss: 0.061984989792108536\n", "Epoch 3545/10000, Training Loss: 0.0521932989358902, Test Loss: 0.0619853213429451\n", "Epoch 3546/10000, Training Loss: 0.05218295753002167, Test Loss: 0.061985671520233154\n", "Epoch 3547/10000, Training Loss: 0.05217264965176582, Test Loss: 0.06198575347661972\n", "Epoch 3548/10000, Training Loss: 0.05216224864125252, Test Loss: 0.06198542192578316\n", "Epoch 3549/10000, Training Loss: 0.052151963114738464, Test Loss: 0.06198478862643242\n", "Epoch 3550/10000, Training Loss: 0.05214163288474083, Test Loss: 0.06198439374566078\n", "Epoch 3551/10000, Training Loss: 0.0521312952041626, Test Loss: 0.06198437884449959\n", "Epoch 3552/10000, Training Loss: 0.052120938897132874, Test Loss: 0.06198472902178764\n", "Epoch 3553/10000, Training Loss: 0.05211063474416733, Test Loss: 0.06198496371507645\n", "Epoch 3554/10000, Training Loss: 0.0521002858877182, Test Loss: 0.06198495253920555\n", "Epoch 3555/10000, Training Loss: 0.05208997428417206, Test Loss: 0.061984673142433167\n", "Epoch 3556/10000, Training Loss: 0.05207963287830353, Test Loss: 0.061984531581401825\n", "Epoch 3557/10000, Training Loss: 0.05206936597824097, Test Loss: 0.061984699219465256\n", "Epoch 3558/10000, Training Loss: 0.05205904319882393, Test Loss: 0.061985015869140625\n", "Epoch 3559/10000, Training Loss: 0.0520487017929554, Test Loss: 0.06198498234152794\n", "Epoch 3560/10000, Training Loss: 0.05203840881586075, Test Loss: 0.06198480725288391\n", "Epoch 3561/10000, Training Loss: 0.05202808976173401, Test Loss: 0.061984483152627945\n", "Epoch 3562/10000, Training Loss: 0.052017781883478165, Test Loss: 0.06198446452617645\n", "Epoch 3563/10000, Training Loss: 0.05200750380754471, Test Loss: 0.06198465824127197\n", "Epoch 3564/10000, Training Loss: 0.051997169852256775, Test Loss: 0.06198505312204361\n", "Epoch 3565/10000, Training Loss: 0.051986873149871826, Test Loss: 0.06198546662926674\n", "Epoch 3566/10000, Training Loss: 0.05197657272219658, Test Loss: 
0.061985310167074203\n", "Epoch 3567/10000, Training Loss: 0.051966287195682526, Test Loss: 0.061984844505786896\n", "Epoch 3568/10000, Training Loss: 0.05195595324039459, Test Loss: 0.061984315514564514\n", "Epoch 3569/10000, Training Loss: 0.05194570869207382, Test Loss: 0.06198401004076004\n", "Epoch 3570/10000, Training Loss: 0.051935382187366486, Test Loss: 0.06198430433869362\n", "Epoch 3571/10000, Training Loss: 0.05192512646317482, Test Loss: 0.06198495998978615\n", "Epoch 3572/10000, Training Loss: 0.05191482976078987, Test Loss: 0.06198572739958763\n", "Epoch 3573/10000, Training Loss: 0.05190454423427582, Test Loss: 0.06198615953326225\n", "Epoch 3574/10000, Training Loss: 0.05189428851008415, Test Loss: 0.061986129730939865\n", "Epoch 3575/10000, Training Loss: 0.05188397318124771, Test Loss: 0.06198548153042793\n", "Epoch 3576/10000, Training Loss: 0.05187369883060455, Test Loss: 0.06198465824127197\n", "Epoch 3577/10000, Training Loss: 0.0518634170293808, Test Loss: 0.06198423355817795\n", "Epoch 3578/10000, Training Loss: 0.05185313522815704, Test Loss: 0.061984505504369736\n", "Epoch 3579/10000, Training Loss: 0.051842931658029556, Test Loss: 0.06198528781533241\n", "Epoch 3580/10000, Training Loss: 0.05183262377977371, Test Loss: 0.06198596954345703\n", "Epoch 3581/10000, Training Loss: 0.05182235315442085, Test Loss: 0.06198631972074509\n", "Epoch 3582/10000, Training Loss: 0.05181211605668068, Test Loss: 0.061986204236745834\n", "Epoch 3583/10000, Training Loss: 0.05180182307958603, Test Loss: 0.06198598071932793\n", "Epoch 3584/10000, Training Loss: 0.05179160460829735, Test Loss: 0.06198583543300629\n", "Epoch 3585/10000, Training Loss: 0.05178132280707359, Test Loss: 0.06198594346642494\n", "Epoch 3586/10000, Training Loss: 0.05177105963230133, Test Loss: 0.061985958367586136\n", "Epoch 3587/10000, Training Loss: 0.051760800182819366, Test Loss: 0.06198588013648987\n", "Epoch 3588/10000, Training Loss: 0.051750548183918, Test Loss: 0.06198595091700554\n", "Epoch 3589/10000, Training Loss: 0.05174031853675842, Test Loss: 0.06198624148964882\n", "Epoch 3590/10000, Training Loss: 0.051730070263147354, Test Loss: 0.0619865283370018\n", "Epoch 3591/10000, Training Loss: 0.05171982944011688, Test Loss: 0.061986688524484634\n", "Epoch 3592/10000, Training Loss: 0.051709577441215515, Test Loss: 0.061986662447452545\n", "Epoch 3593/10000, Training Loss: 0.05169934406876564, Test Loss: 0.06198675185441971\n", "Epoch 3594/10000, Training Loss: 0.05168911814689636, Test Loss: 0.061987102031707764\n", "Epoch 3595/10000, Training Loss: 0.051678869873285294, Test Loss: 0.06198748201131821\n", "Epoch 3596/10000, Training Loss: 0.05166863277554512, Test Loss: 0.06198781356215477\n", "Epoch 3597/10000, Training Loss: 0.05165841057896614, Test Loss: 0.06198778748512268\n", "Epoch 3598/10000, Training Loss: 0.051648177206516266, Test Loss: 0.061987411230802536\n", "Epoch 3599/10000, Training Loss: 0.05163795128464699, Test Loss: 0.061986878514289856\n", "Epoch 3600/10000, Training Loss: 0.051627736538648605, Test Loss: 0.06198658049106598\n", "Epoch 3601/10000, Training Loss: 0.05161750316619873, Test Loss: 0.06198705732822418\n", "Epoch 3602/10000, Training Loss: 0.05160727724432945, Test Loss: 0.061988022178411484\n", "Epoch 3603/10000, Training Loss: 0.05159704014658928, Test Loss: 0.06198909133672714\n", "Epoch 3604/10000, Training Loss: 0.05158686637878418, Test Loss: 0.06198950484395027\n", "Epoch 3605/10000, Training Loss: 0.051576633006334305, Test Loss: 0.061989299952983856\n", 
"Epoch 3606/10000, Training Loss: 0.05156642571091652, Test Loss: 0.06198899447917938\n", "Epoch 3607/10000, Training Loss: 0.051556218415498734, Test Loss: 0.061988793313503265\n", "Epoch 3608/10000, Training Loss: 0.05154599994421005, Test Loss: 0.06198845058679581\n", "Epoch 3609/10000, Training Loss: 0.051535800099372864, Test Loss: 0.06198834627866745\n", "Epoch 3610/10000, Training Loss: 0.051525577902793884, Test Loss: 0.061988670378923416\n", "Epoch 3611/10000, Training Loss: 0.05151538923382759, Test Loss: 0.06198945268988609\n", "Epoch 3612/10000, Training Loss: 0.05150517448782921, Test Loss: 0.06199008226394653\n", "Epoch 3613/10000, Training Loss: 0.051494985818862915, Test Loss: 0.061990439891815186\n", "Epoch 3614/10000, Training Loss: 0.05148479342460632, Test Loss: 0.06199077144265175\n", "Epoch 3615/10000, Training Loss: 0.05147458240389824, Test Loss: 0.06199050322175026\n", "Epoch 3616/10000, Training Loss: 0.05146441608667374, Test Loss: 0.06199060380458832\n", "Epoch 3617/10000, Training Loss: 0.05145421624183655, Test Loss: 0.061990801244974136\n", "Epoch 3618/10000, Training Loss: 0.05144402012228966, Test Loss: 0.06199079006910324\n", "Epoch 3619/10000, Training Loss: 0.051433857530355453, Test Loss: 0.061990801244974136\n", "Epoch 3620/10000, Training Loss: 0.051423653960227966, Test Loss: 0.06199115142226219\n", "Epoch 3621/10000, Training Loss: 0.05141346529126167, Test Loss: 0.061991311609745026\n", "Epoch 3622/10000, Training Loss: 0.05140327662229538, Test Loss: 0.0619918517768383\n", "Epoch 3623/10000, Training Loss: 0.05139312520623207, Test Loss: 0.061992429196834564\n", "Epoch 3624/10000, Training Loss: 0.05138290300965309, Test Loss: 0.06199268996715546\n", "Epoch 3625/10000, Training Loss: 0.05137275904417038, Test Loss: 0.06199270114302635\n", "Epoch 3626/10000, Training Loss: 0.051362596452236176, Test Loss: 0.061992548406124115\n", "Epoch 3627/10000, Training Loss: 0.05135243013501167, Test Loss: 0.061992719769477844\n", "Epoch 3628/10000, Training Loss: 0.05134228616952896, Test Loss: 0.061993058770895004\n", "Epoch 3629/10000, Training Loss: 0.05133207142353058, Test Loss: 0.06199366971850395\n", "Epoch 3630/10000, Training Loss: 0.05132192000746727, Test Loss: 0.06199405714869499\n", "Epoch 3631/10000, Training Loss: 0.05131177231669426, Test Loss: 0.061994221061468124\n", "Epoch 3632/10000, Training Loss: 0.051301635801792145, Test Loss: 0.061994075775146484\n", "Epoch 3633/10000, Training Loss: 0.05129147693514824, Test Loss: 0.06199418380856514\n", "Epoch 3634/10000, Training Loss: 0.05128130689263344, Test Loss: 0.06199458986520767\n", "Epoch 3635/10000, Training Loss: 0.051271166652441025, Test Loss: 0.06199518218636513\n", "Epoch 3636/10000, Training Loss: 0.051260992884635925, Test Loss: 0.061995625495910645\n", "Epoch 3637/10000, Training Loss: 0.05125084146857262, Test Loss: 0.061995819211006165\n", "Epoch 3638/10000, Training Loss: 0.0512407124042511, Test Loss: 0.06199584901332855\n", "Epoch 3639/10000, Training Loss: 0.051230549812316895, Test Loss: 0.061995990574359894\n", "Epoch 3640/10000, Training Loss: 0.05122041329741478, Test Loss: 0.061996378004550934\n", "Epoch 3641/10000, Training Loss: 0.05121028423309326, Test Loss: 0.06199668347835541\n", "Epoch 3642/10000, Training Loss: 0.05120011046528816, Test Loss: 0.061996955424547195\n", "Epoch 3643/10000, Training Loss: 0.051190052181482315, Test Loss: 0.06199726089835167\n", "Epoch 3644/10000, Training Loss: 0.05117987096309662, Test Loss: 0.0619976632297039\n", "Epoch 3645/10000, 
Training Loss: 0.051169753074645996, Test Loss: 0.06199796125292778\n", "Epoch 3646/10000, Training Loss: 0.051159631460905075, Test Loss: 0.06199810653924942\n", "Epoch 3647/10000, Training Loss: 0.05114948749542236, Test Loss: 0.06199853867292404\n", "Epoch 3648/10000, Training Loss: 0.05113936588168144, Test Loss: 0.06199886277318001\n", "Epoch 3649/10000, Training Loss: 0.05112924054265022, Test Loss: 0.061999354511499405\n", "Epoch 3650/10000, Training Loss: 0.05111910030245781, Test Loss: 0.061999861150979996\n", "Epoch 3651/10000, Training Loss: 0.051108989864587784, Test Loss: 0.06200011074542999\n", "Epoch 3652/10000, Training Loss: 0.051098860800266266, Test Loss: 0.06200011074542999\n", "Epoch 3653/10000, Training Loss: 0.05108872056007385, Test Loss: 0.06199993938207626\n", "Epoch 3654/10000, Training Loss: 0.051078613847494125, Test Loss: 0.062000300735235214\n", "Epoch 3655/10000, Training Loss: 0.0510685071349144, Test Loss: 0.06200096011161804\n", "Epoch 3656/10000, Training Loss: 0.05105840042233467, Test Loss: 0.06200188398361206\n", "Epoch 3657/10000, Training Loss: 0.051048312336206436, Test Loss: 0.0620025172829628\n", "Epoch 3658/10000, Training Loss: 0.051038216799497604, Test Loss: 0.06200268492102623\n", "Epoch 3659/10000, Training Loss: 0.05102808400988579, Test Loss: 0.06200255826115608\n", "Epoch 3660/10000, Training Loss: 0.051017966121435165, Test Loss: 0.06200234591960907\n", "Epoch 3661/10000, Training Loss: 0.05100791156291962, Test Loss: 0.06200259551405907\n", "Epoch 3662/10000, Training Loss: 0.050997789949178696, Test Loss: 0.06200338900089264\n", "Epoch 3663/10000, Training Loss: 0.05098770931363106, Test Loss: 0.06200440600514412\n", "Epoch 3664/10000, Training Loss: 0.05097760632634163, Test Loss: 0.06200527027249336\n", "Epoch 3665/10000, Training Loss: 0.0509675107896328, Test Loss: 0.062005508691072464\n", "Epoch 3666/10000, Training Loss: 0.050957418978214264, Test Loss: 0.06200525909662247\n", "Epoch 3667/10000, Training Loss: 0.05094736069440842, Test Loss: 0.062004946172237396\n", "Epoch 3668/10000, Training Loss: 0.05093727633357048, Test Loss: 0.06200483441352844\n", "Epoch 3669/10000, Training Loss: 0.05092718079686165, Test Loss: 0.06200519576668739\n", "Epoch 3670/10000, Training Loss: 0.05091710761189461, Test Loss: 0.06200617924332619\n", "Epoch 3671/10000, Training Loss: 0.05090704932808876, Test Loss: 0.06200756877660751\n", "Epoch 3672/10000, Training Loss: 0.05089695751667023, Test Loss: 0.06200876086950302\n", "Epoch 3673/10000, Training Loss: 0.0508868582546711, Test Loss: 0.062009479850530624\n", "Epoch 3674/10000, Training Loss: 0.05087677016854286, Test Loss: 0.06200934201478958\n", "Epoch 3675/10000, Training Loss: 0.050866689532995224, Test Loss: 0.06200861558318138\n", "Epoch 3676/10000, Training Loss: 0.05085664987564087, Test Loss: 0.06200789660215378\n", "Epoch 3677/10000, Training Loss: 0.05084657669067383, Test Loss: 0.06200782209634781\n", "Epoch 3678/10000, Training Loss: 0.050836529582738876, Test Loss: 0.062008731067180634\n", "Epoch 3679/10000, Training Loss: 0.050826434046030045, Test Loss: 0.062010377645492554\n", "Epoch 3680/10000, Training Loss: 0.050816405564546585, Test Loss: 0.06201205402612686\n", "Epoch 3681/10000, Training Loss: 0.050806306302547455, Test Loss: 0.06201287731528282\n", "Epoch 3682/10000, Training Loss: 0.05079628527164459, Test Loss: 0.0620129257440567\n", "Epoch 3683/10000, Training Loss: 0.050786200910806656, Test Loss: 0.06201224774122238\n", "Epoch 3684/10000, Training Loss: 
0.05077617987990379, Test Loss: 0.06201152130961418\n", "Epoch 3685/10000, Training Loss: 0.05076611042022705, Test Loss: 0.062011655420064926\n", "Epoch 3686/10000, Training Loss: 0.050756048411130905, Test Loss: 0.06201251968741417\n", "Epoch 3687/10000, Training Loss: 0.050746019929647446, Test Loss: 0.062013912945985794\n", "Epoch 3688/10000, Training Loss: 0.050735954195261, Test Loss: 0.0620151087641716\n", "Epoch 3689/10000, Training Loss: 0.05072590336203575, Test Loss: 0.0620158389210701\n", "Epoch 3690/10000, Training Loss: 0.050715889781713486, Test Loss: 0.06201598420739174\n", "Epoch 3691/10000, Training Loss: 0.050705794245004654, Test Loss: 0.062016043812036514\n", "Epoch 3692/10000, Training Loss: 0.050695810467004776, Test Loss: 0.06201622635126114\n", "Epoch 3693/10000, Training Loss: 0.050685763359069824, Test Loss: 0.06201634556055069\n", "Epoch 3694/10000, Training Loss: 0.05067572370171547, Test Loss: 0.06201663985848427\n", "Epoch 3695/10000, Training Loss: 0.05066570267081261, Test Loss: 0.06201735883951187\n", "Epoch 3696/10000, Training Loss: 0.05065568536520004, Test Loss: 0.062018293887376785\n", "Epoch 3697/10000, Training Loss: 0.050645653158426285, Test Loss: 0.06201911345124245\n", "Epoch 3698/10000, Training Loss: 0.05063560977578163, Test Loss: 0.06201949343085289\n", "Epoch 3699/10000, Training Loss: 0.05062558501958847, Test Loss: 0.062019724398851395\n", "Epoch 3700/10000, Training Loss: 0.050615567713975906, Test Loss: 0.06201985850930214\n", "Epoch 3701/10000, Training Loss: 0.05060552433133125, Test Loss: 0.06202033534646034\n", "Epoch 3702/10000, Training Loss: 0.05059553310275078, Test Loss: 0.06202119216322899\n", "Epoch 3703/10000, Training Loss: 0.05058549717068672, Test Loss: 0.062022119760513306\n", "Epoch 3704/10000, Training Loss: 0.05057545006275177, Test Loss: 0.062023043632507324\n", "Epoch 3705/10000, Training Loss: 0.050565484911203384, Test Loss: 0.06202344223856926\n", "Epoch 3706/10000, Training Loss: 0.050555519759655, Test Loss: 0.06202337145805359\n", "Epoch 3707/10000, Training Loss: 0.05054549127817154, Test Loss: 0.062023188918828964\n", "Epoch 3708/10000, Training Loss: 0.050535473972558975, Test Loss: 0.06202321499586105\n", "Epoch 3709/10000, Training Loss: 0.05052542686462402, Test Loss: 0.062023844569921494\n", "Epoch 3710/10000, Training Loss: 0.050515443086624146, Test Loss: 0.062025006860494614\n", "Epoch 3711/10000, Training Loss: 0.05050545930862427, Test Loss: 0.0620264895260334\n", "Epoch 3712/10000, Training Loss: 0.05049542710185051, Test Loss: 0.06202783063054085\n", "Epoch 3713/10000, Training Loss: 0.05048542842268944, Test Loss: 0.06202833354473114\n", "Epoch 3714/10000, Training Loss: 0.05047548562288284, Test Loss: 0.06202823668718338\n", "Epoch 3715/10000, Training Loss: 0.050465431064367294, Test Loss: 0.062027715146541595\n", "Epoch 3716/10000, Training Loss: 0.050455491989851, Test Loss: 0.06202748045325279\n", "Epoch 3717/10000, Training Loss: 0.05044548586010933, Test Loss: 0.06202813610434532\n", "Epoch 3718/10000, Training Loss: 0.05043547973036766, Test Loss: 0.06202944368124008\n", "Epoch 3719/10000, Training Loss: 0.050425514578819275, Test Loss: 0.062031034380197525\n", "Epoch 3720/10000, Training Loss: 0.05041554570198059, Test Loss: 0.06203208863735199\n", "Epoch 3721/10000, Training Loss: 0.050405535846948624, Test Loss: 0.062032558023929596\n", "Epoch 3722/10000, Training Loss: 0.05039553344249725, Test Loss: 0.062032558023929596\n", "Epoch 3723/10000, Training Loss: 0.05038559436798096, Test 
Loss: 0.06203232333064079\n", "Epoch 3724/10000, Training Loss: 0.05037559196352959, Test Loss: 0.06203266233205795\n", "Epoch 3725/10000, Training Loss: 0.05036561191082001, Test Loss: 0.06203344836831093\n", "Epoch 3726/10000, Training Loss: 0.050355665385723114, Test Loss: 0.062034722417593\n", "Epoch 3727/10000, Training Loss: 0.050345711410045624, Test Loss: 0.06203566864132881\n", "Epoch 3728/10000, Training Loss: 0.050335731357336044, Test Loss: 0.062036242336034775\n", "Epoch 3729/10000, Training Loss: 0.050325751304626465, Test Loss: 0.06203648820519447\n", "Epoch 3730/10000, Training Loss: 0.05031575262546539, Test Loss: 0.06203658878803253\n", "Epoch 3731/10000, Training Loss: 0.050305817276239395, Test Loss: 0.06203709915280342\n", "Epoch 3732/10000, Training Loss: 0.050295840948820114, Test Loss: 0.06203814968466759\n", "Epoch 3733/10000, Training Loss: 0.050285883247852325, Test Loss: 0.06203924119472504\n", "Epoch 3734/10000, Training Loss: 0.050275951623916626, Test Loss: 0.062040138989686966\n", "Epoch 3735/10000, Training Loss: 0.05026600882411003, Test Loss: 0.062040552496910095\n", "Epoch 3736/10000, Training Loss: 0.05025607347488403, Test Loss: 0.06204073131084442\n", "Epoch 3737/10000, Training Loss: 0.05024609714746475, Test Loss: 0.06204086169600487\n", "Epoch 3738/10000, Training Loss: 0.050236113369464874, Test Loss: 0.062041301280260086\n", "Epoch 3739/10000, Training Loss: 0.050226181745529175, Test Loss: 0.06204213947057724\n", "Epoch 3740/10000, Training Loss: 0.05021625757217407, Test Loss: 0.06204335018992424\n", "Epoch 3741/10000, Training Loss: 0.050206296145915985, Test Loss: 0.062044717371463776\n", "Epoch 3742/10000, Training Loss: 0.05019637569785118, Test Loss: 0.06204577907919884\n", "Epoch 3743/10000, Training Loss: 0.0501864030957222, Test Loss: 0.06204619258642197\n", "Epoch 3744/10000, Training Loss: 0.05017650127410889, Test Loss: 0.06204615905880928\n", "Epoch 3745/10000, Training Loss: 0.05016656965017319, Test Loss: 0.062045980244874954\n", "Epoch 3746/10000, Training Loss: 0.05015658587217331, Test Loss: 0.062046170234680176\n", "Epoch 3747/10000, Training Loss: 0.050146687775850296, Test Loss: 0.06204681470990181\n", "Epoch 3748/10000, Training Loss: 0.05013673007488251, Test Loss: 0.062047913670539856\n", "Epoch 3749/10000, Training Loss: 0.050126828253269196, Test Loss: 0.0620495043694973\n", "Epoch 3750/10000, Training Loss: 0.05011685937643051, Test Loss: 0.06205117702484131\n", "Epoch 3751/10000, Training Loss: 0.05010698735713959, Test Loss: 0.062052350491285324\n", "Epoch 3752/10000, Training Loss: 0.05009705573320389, Test Loss: 0.06205269321799278\n", "Epoch 3753/10000, Training Loss: 0.05008714273571968, Test Loss: 0.062052320688962936\n", "Epoch 3754/10000, Training Loss: 0.05007721111178398, Test Loss: 0.06205197796225548\n", "Epoch 3755/10000, Training Loss: 0.05006727576255798, Test Loss: 0.06205237656831741\n", "Epoch 3756/10000, Training Loss: 0.050057388842105865, Test Loss: 0.06205347552895546\n", "Epoch 3757/10000, Training Loss: 0.05004749074578285, Test Loss: 0.062054868787527084\n", "Epoch 3758/10000, Training Loss: 0.05003757402300835, Test Loss: 0.06205618008971214\n", "Epoch 3759/10000, Training Loss: 0.050027649849653244, Test Loss: 0.062057074159383774\n", "Epoch 3760/10000, Training Loss: 0.05001772940158844, Test Loss: 0.0620574951171875\n", "Epoch 3761/10000, Training Loss: 0.05000779405236244, Test Loss: 0.062057990580797195\n", "Epoch 3762/10000, Training Loss: 0.049997925758361816, Test Loss: 
0.06205865740776062\n", "Epoch 3763/10000, Training Loss: 0.049988020211458206, Test Loss: 0.062059465795755386\n", "Epoch 3764/10000, Training Loss: 0.04997812211513519, Test Loss: 0.062060222029685974\n", "Epoch 3765/10000, Training Loss: 0.049968231469392776, Test Loss: 0.062060825526714325\n", "Epoch 3766/10000, Training Loss: 0.049958307296037674, Test Loss: 0.06206135451793671\n", "Epoch 3767/10000, Training Loss: 0.04994845762848854, Test Loss: 0.06206198409199715\n", "Epoch 3768/10000, Training Loss: 0.049938540905714035, Test Loss: 0.062062714248895645\n", "Epoch 3769/10000, Training Loss: 0.049928657710552216, Test Loss: 0.06206377595663071\n", "Epoch 3770/10000, Training Loss: 0.0499187670648098, Test Loss: 0.06206510588526726\n", "Epoch 3771/10000, Training Loss: 0.04990889132022858, Test Loss: 0.06206642836332321\n", "Epoch 3772/10000, Training Loss: 0.04989900067448616, Test Loss: 0.062067121267318726\n", "Epoch 3773/10000, Training Loss: 0.04988912120461464, Test Loss: 0.062067318707704544\n", "Epoch 3774/10000, Training Loss: 0.04987921565771103, Test Loss: 0.06206734851002693\n", "Epoch 3775/10000, Training Loss: 0.04986933618783951, Test Loss: 0.062067609280347824\n", "Epoch 3776/10000, Training Loss: 0.04985947161912918, Test Loss: 0.06206873431801796\n", "Epoch 3777/10000, Training Loss: 0.04984958469867706, Test Loss: 0.062070198357105255\n", "Epoch 3778/10000, Training Loss: 0.049839697778224945, Test Loss: 0.06207152083516121\n", "Epoch 3779/10000, Training Loss: 0.04982982948422432, Test Loss: 0.06207231059670448\n", "Epoch 3780/10000, Training Loss: 0.0498199462890625, Test Loss: 0.06207277253270149\n", "Epoch 3781/10000, Training Loss: 0.04981011152267456, Test Loss: 0.062073156237602234\n", "Epoch 3782/10000, Training Loss: 0.049800239503383636, Test Loss: 0.06207384914159775\n", "Epoch 3783/10000, Training Loss: 0.049790363758802414, Test Loss: 0.06207495927810669\n", "Epoch 3784/10000, Training Loss: 0.04978053271770477, Test Loss: 0.06207629665732384\n", "Epoch 3785/10000, Training Loss: 0.049770668148994446, Test Loss: 0.062077250331640244\n", "Epoch 3786/10000, Training Loss: 0.04976079985499382, Test Loss: 0.06207787245512009\n", "Epoch 3787/10000, Training Loss: 0.0497509241104126, Test Loss: 0.06207823380827904\n", "Epoch 3788/10000, Training Loss: 0.04974108934402466, Test Loss: 0.06207859143614769\n", "Epoch 3789/10000, Training Loss: 0.04973124712705612, Test Loss: 0.06207918003201485\n", "Epoch 3790/10000, Training Loss: 0.049721404910087585, Test Loss: 0.062080446630716324\n", "Epoch 3791/10000, Training Loss: 0.04971150681376457, Test Loss: 0.062082018703222275\n", "Epoch 3792/10000, Training Loss: 0.04970168694853783, Test Loss: 0.062083613127470016\n", "Epoch 3793/10000, Training Loss: 0.049691807478666306, Test Loss: 0.062084466218948364\n", "Epoch 3794/10000, Training Loss: 0.04968201369047165, Test Loss: 0.062084853649139404\n", "Epoch 3795/10000, Training Loss: 0.049672145396471024, Test Loss: 0.06208494305610657\n", "Epoch 3796/10000, Training Loss: 0.04966231808066368, Test Loss: 0.06208516284823418\n", "Epoch 3797/10000, Training Loss: 0.04965247958898544, Test Loss: 0.0620858371257782\n", "Epoch 3798/10000, Training Loss: 0.049642615020275116, Test Loss: 0.06208694353699684\n", "Epoch 3799/10000, Training Loss: 0.049632783979177475, Test Loss: 0.06208851560950279\n", "Epoch 3800/10000, Training Loss: 0.049622975289821625, Test Loss: 0.062090300023555756\n", "Epoch 3801/10000, Training Loss: 0.049613118171691895, Test Loss: 
0.06209184601902962\n", "Epoch 3802/10000, Training Loss: 0.04960326477885246, Test Loss: 0.062092531472444534\n", "Epoch 3803/10000, Training Loss: 0.04959346726536751, Test Loss: 0.06209258362650871\n", "Epoch 3804/10000, Training Loss: 0.049583617597818375, Test Loss: 0.06209259107708931\n", "Epoch 3805/10000, Training Loss: 0.049573805183172226, Test Loss: 0.062092769891023636\n", "Epoch 3806/10000, Training Loss: 0.04956400766968727, Test Loss: 0.062093932181596756\n", "Epoch 3807/10000, Training Loss: 0.049554165452718735, Test Loss: 0.06209578365087509\n", "Epoch 3808/10000, Training Loss: 0.04954434186220169, Test Loss: 0.062097396701574326\n", "Epoch 3809/10000, Training Loss: 0.04953451827168465, Test Loss: 0.06209859251976013\n", "Epoch 3810/10000, Training Loss: 0.049524709582328796, Test Loss: 0.062099188566207886\n", "Epoch 3811/10000, Training Loss: 0.049514882266521454, Test Loss: 0.06209954619407654\n", "Epoch 3812/10000, Training Loss: 0.049505092203617096, Test Loss: 0.062100086361169815\n", "Epoch 3813/10000, Training Loss: 0.049495283514261246, Test Loss: 0.0621001310646534\n", "Epoch 3814/10000, Training Loss: 0.04948548227548599, Test Loss: 0.062099888920784\n", "Epoch 3815/10000, Training Loss: 0.049475688487291336, Test Loss: 0.06209995225071907\n", "Epoch 3816/10000, Training Loss: 0.049465883523225784, Test Loss: 0.0621008537709713\n", "Epoch 3817/10000, Training Loss: 0.04945606365799904, Test Loss: 0.06210219860076904\n", "Epoch 3818/10000, Training Loss: 0.049446288496255875, Test Loss: 0.06210428103804588\n", "Epoch 3819/10000, Training Loss: 0.049436457455158234, Test Loss: 0.06210605800151825\n", "Epoch 3820/10000, Training Loss: 0.04942669719457626, Test Loss: 0.062107283622026443\n", "Epoch 3821/10000, Training Loss: 0.04941689968109131, Test Loss: 0.062108006328344345\n", "Epoch 3822/10000, Training Loss: 0.04940709099173546, Test Loss: 0.06210821866989136\n", "Epoch 3823/10000, Training Loss: 0.049397312104701996, Test Loss: 0.062108222395181656\n", "Epoch 3824/10000, Training Loss: 0.04938756674528122, Test Loss: 0.062108658254146576\n", "Epoch 3825/10000, Training Loss: 0.04937775805592537, Test Loss: 0.06210966035723686\n", "Epoch 3826/10000, Training Loss: 0.04936797916889191, Test Loss: 0.06211097165942192\n", "Epoch 3827/10000, Training Loss: 0.04935820400714874, Test Loss: 0.06211250275373459\n", "Epoch 3828/10000, Training Loss: 0.04934845119714737, Test Loss: 0.06211390718817711\n", "Epoch 3829/10000, Training Loss: 0.049338653683662415, Test Loss: 0.06211529299616814\n", "Epoch 3830/10000, Training Loss: 0.04932885989546776, Test Loss: 0.062116362154483795\n", "Epoch 3831/10000, Training Loss: 0.049319077283144, Test Loss: 0.06211678311228752\n", "Epoch 3832/10000, Training Loss: 0.04930933192372322, Test Loss: 0.06211671233177185\n", "Epoch 3833/10000, Training Loss: 0.04929954558610916, Test Loss: 0.06211693957448006\n", "Epoch 3834/10000, Training Loss: 0.04928979277610779, Test Loss: 0.06211778521537781\n", "Epoch 3835/10000, Training Loss: 0.04928002133965492, Test Loss: 0.062119197100400925\n", "Epoch 3836/10000, Training Loss: 0.04927023500204086, Test Loss: 0.06212097033858299\n", "Epoch 3837/10000, Training Loss: 0.0492604561150074, Test Loss: 0.06212284043431282\n", "Epoch 3838/10000, Training Loss: 0.04925074428319931, Test Loss: 0.062124259769916534\n", "Epoch 3839/10000, Training Loss: 0.04924095422029495, Test Loss: 0.06212491914629936\n", "Epoch 3840/10000, Training Loss: 0.04923119395971298, Test Loss: 0.06212504580616951\n", 
"Epoch 3841/10000, Training Loss: 0.0492214560508728, Test Loss: 0.06212508678436279\n", "Epoch 3842/10000, Training Loss: 0.049211692065000534, Test Loss: 0.06212550401687622\n", "Epoch 3843/10000, Training Loss: 0.04920194670557976, Test Loss: 0.06212657317519188\n", "Epoch 3844/10000, Training Loss: 0.049192190170288086, Test Loss: 0.062128402292728424\n", "Epoch 3845/10000, Training Loss: 0.0491824708878994, Test Loss: 0.062130559235811234\n", "Epoch 3846/10000, Training Loss: 0.04917272552847862, Test Loss: 0.06213194504380226\n", "Epoch 3847/10000, Training Loss: 0.049162957817316055, Test Loss: 0.062132690101861954\n", "Epoch 3848/10000, Training Loss: 0.049153245985507965, Test Loss: 0.06213292479515076\n", "Epoch 3849/10000, Training Loss: 0.0491434670984745, Test Loss: 0.06213314086198807\n", "Epoch 3850/10000, Training Loss: 0.04913373664021492, Test Loss: 0.06213371828198433\n", "Epoch 3851/10000, Training Loss: 0.04912400618195534, Test Loss: 0.062134843319654465\n", "Epoch 3852/10000, Training Loss: 0.049114275723695755, Test Loss: 0.0621366947889328\n", "Epoch 3853/10000, Training Loss: 0.04910452663898468, Test Loss: 0.062138691544532776\n", "Epoch 3854/10000, Training Loss: 0.04909480735659599, Test Loss: 0.062139999121427536\n", "Epoch 3855/10000, Training Loss: 0.04908508434891701, Test Loss: 0.062140725553035736\n", "Epoch 3856/10000, Training Loss: 0.04907534271478653, Test Loss: 0.062141016125679016\n", "Epoch 3857/10000, Training Loss: 0.04906566068530083, Test Loss: 0.06214124709367752\n", "Epoch 3858/10000, Training Loss: 0.04905589297413826, Test Loss: 0.06214193254709244\n", "Epoch 3859/10000, Training Loss: 0.04904620721936226, Test Loss: 0.062143199145793915\n", "Epoch 3860/10000, Training Loss: 0.04903644695878029, Test Loss: 0.06214519590139389\n", "Epoch 3861/10000, Training Loss: 0.0490267239511013, Test Loss: 0.06214708462357521\n", "Epoch 3862/10000, Training Loss: 0.04901697859168053, Test Loss: 0.06214834377169609\n", "Epoch 3863/10000, Training Loss: 0.04900730401277542, Test Loss: 0.06214889511466026\n", "Epoch 3864/10000, Training Loss: 0.048997603356838226, Test Loss: 0.062149256467819214\n", "Epoch 3865/10000, Training Loss: 0.04898788407444954, Test Loss: 0.062149785459041595\n", "Epoch 3866/10000, Training Loss: 0.04897816851735115, Test Loss: 0.06215077266097069\n", "Epoch 3867/10000, Training Loss: 0.04896846041083336, Test Loss: 0.06215205416083336\n", "Epoch 3868/10000, Training Loss: 0.04895877093076706, Test Loss: 0.06215394288301468\n", "Epoch 3869/10000, Training Loss: 0.04894905909895897, Test Loss: 0.062155984342098236\n", "Epoch 3870/10000, Training Loss: 0.04893934354186058, Test Loss: 0.06215722858905792\n", "Epoch 3871/10000, Training Loss: 0.04892963543534279, Test Loss: 0.06215781345963478\n", "Epoch 3872/10000, Training Loss: 0.048919957131147385, Test Loss: 0.062158018350601196\n", "Epoch 3873/10000, Training Loss: 0.048910267651081085, Test Loss: 0.06215842068195343\n", "Epoch 3874/10000, Training Loss: 0.04890058934688568, Test Loss: 0.06215924024581909\n", "Epoch 3875/10000, Training Loss: 0.048890870064496994, Test Loss: 0.06216077134013176\n", "Epoch 3876/10000, Training Loss: 0.048881180584430695, Test Loss: 0.062162838876247406\n", "Epoch 3877/10000, Training Loss: 0.0488714724779129, Test Loss: 0.06216472014784813\n", "Epoch 3878/10000, Training Loss: 0.0488617867231369, Test Loss: 0.06216580793261528\n", "Epoch 3879/10000, Training Loss: 0.048852093517780304, Test Loss: 0.06216667965054512\n", "Epoch 3880/10000, 
Training Loss: 0.04884244501590729, Test Loss: 0.062167249619960785\n", "Epoch 3881/10000, Training Loss: 0.048832736909389496, Test Loss: 0.062167923897504807\n", "Epoch 3882/10000, Training Loss: 0.04882308468222618, Test Loss: 0.06216889247298241\n", "Epoch 3883/10000, Training Loss: 0.04881339147686958, Test Loss: 0.062170181423425674\n", "Epoch 3884/10000, Training Loss: 0.04880368709564209, Test Loss: 0.06217192858457565\n", "Epoch 3885/10000, Training Loss: 0.04879400506615639, Test Loss: 0.06217363849282265\n", "Epoch 3886/10000, Training Loss: 0.04878433793783188, Test Loss: 0.062174949795007706\n", "Epoch 3887/10000, Training Loss: 0.04877464845776558, Test Loss: 0.0621759332716465\n", "Epoch 3888/10000, Training Loss: 0.04876498878002167, Test Loss: 0.06217697262763977\n", "Epoch 3889/10000, Training Loss: 0.04875531792640686, Test Loss: 0.062177710235118866\n", "Epoch 3890/10000, Training Loss: 0.04874563589692116, Test Loss: 0.062178343534469604\n", "Epoch 3891/10000, Training Loss: 0.04873599857091904, Test Loss: 0.06217944994568825\n", "Epoch 3892/10000, Training Loss: 0.04872630909085274, Test Loss: 0.062180958688259125\n", "Epoch 3893/10000, Training Loss: 0.04871668666601181, Test Loss: 0.0621829479932785\n", "Epoch 3894/10000, Training Loss: 0.048707008361816406, Test Loss: 0.06218452379107475\n", "Epoch 3895/10000, Training Loss: 0.04869731888175011, Test Loss: 0.06218571215867996\n", "Epoch 3896/10000, Training Loss: 0.04868770018219948, Test Loss: 0.06218649446964264\n", "Epoch 3897/10000, Training Loss: 0.04867801442742348, Test Loss: 0.06218738481402397\n", "Epoch 3898/10000, Training Loss: 0.04866839200258255, Test Loss: 0.06218835338950157\n", "Epoch 3899/10000, Training Loss: 0.04865873232483864, Test Loss: 0.06218954175710678\n", "Epoch 3900/10000, Training Loss: 0.04864906892180443, Test Loss: 0.06219097226858139\n", "Epoch 3901/10000, Training Loss: 0.04863942414522171, Test Loss: 0.062192484736442566\n", "Epoch 3902/10000, Training Loss: 0.048629797995090485, Test Loss: 0.06219387799501419\n", "Epoch 3903/10000, Training Loss: 0.04862013831734657, Test Loss: 0.062195416539907455\n", "Epoch 3904/10000, Training Loss: 0.048610493540763855, Test Loss: 0.06219680607318878\n", "Epoch 3905/10000, Training Loss: 0.048600826412439346, Test Loss: 0.062197912484407425\n", "Epoch 3906/10000, Training Loss: 0.04859120398759842, Test Loss: 0.06219879165291786\n", "Epoch 3907/10000, Training Loss: 0.048581551760435104, Test Loss: 0.062199667096138\n", "Epoch 3908/10000, Training Loss: 0.04857189953327179, Test Loss: 0.062200725078582764\n", "Epoch 3909/10000, Training Loss: 0.04856228455901146, Test Loss: 0.06220199540257454\n", "Epoch 3910/10000, Training Loss: 0.04855268821120262, Test Loss: 0.062203578650951385\n", "Epoch 3911/10000, Training Loss: 0.0485430508852005, Test Loss: 0.06220531463623047\n", "Epoch 3912/10000, Training Loss: 0.04853338375687599, Test Loss: 0.06220680847764015\n", "Epoch 3913/10000, Training Loss: 0.04852379485964775, Test Loss: 0.06220805644989014\n", "Epoch 3914/10000, Training Loss: 0.04851412773132324, Test Loss: 0.062209416180849075\n", "Epoch 3915/10000, Training Loss: 0.0485045351088047, Test Loss: 0.06221082806587219\n", "Epoch 3916/10000, Training Loss: 0.04849490523338318, Test Loss: 0.062211774289608\n", "Epoch 3917/10000, Training Loss: 0.048485301434993744, Test Loss: 0.06221272796392441\n", "Epoch 3918/10000, Training Loss: 0.04847566410899162, Test Loss: 0.06221385300159454\n", "Epoch 3919/10000, Training Loss: 
0.0484660267829895, Test Loss: 0.06221514567732811\n", "Epoch 3920/10000, Training Loss: 0.04845641925930977, Test Loss: 0.06221675127744675\n", "[... output for epochs 3921-4704 omitted: training loss decreases steadily from ~0.0485 to ~0.0414 while test loss slowly rises from ~0.0622 to ~0.0648 ...]\n", "Epoch 4705/10000, Training Loss: 0.04138723760843277, Test Loss: 0.06481508910655975\n", "Epoch 4706/10000, Training Loss: 
0.041378773748874664, Test Loss: 0.06481973826885223\n", "Epoch 4707/10000, Training Loss: 0.04137035831809044, Test Loss: 0.06483139097690582\n", "Epoch 4708/10000, Training Loss: 0.04136188328266144, Test Loss: 0.06484384089708328\n", "Epoch 4709/10000, Training Loss: 0.04135343059897423, Test Loss: 0.06485066562891006\n", "Epoch 4710/10000, Training Loss: 0.041344933211803436, Test Loss: 0.06485109031200409\n", "Epoch 4711/10000, Training Loss: 0.04133648797869682, Test Loss: 0.06485053151845932\n", "Epoch 4712/10000, Training Loss: 0.041328027844429016, Test Loss: 0.06485473364591599\n", "Epoch 4713/10000, Training Loss: 0.0413195937871933, Test Loss: 0.06486447155475616\n", "Epoch 4714/10000, Training Loss: 0.04131116718053818, Test Loss: 0.06487566232681274\n", "Epoch 4715/10000, Training Loss: 0.04130270704627037, Test Loss: 0.06487837433815002\n", "Epoch 4716/10000, Training Loss: 0.04129420965909958, Test Loss: 0.06487622857093811\n", "Epoch 4717/10000, Training Loss: 0.041285768151283264, Test Loss: 0.06487688422203064\n", "Epoch 4718/10000, Training Loss: 0.041277334094047546, Test Loss: 0.064884714782238\n", "Epoch 4719/10000, Training Loss: 0.04126890003681183, Test Loss: 0.06489699333906174\n", "Epoch 4720/10000, Training Loss: 0.04126043617725372, Test Loss: 0.06490709632635117\n", "Epoch 4721/10000, Training Loss: 0.041251976042985916, Test Loss: 0.06491558998823166\n", "Epoch 4722/10000, Training Loss: 0.04124350845813751, Test Loss: 0.06491999328136444\n", "Epoch 4723/10000, Training Loss: 0.04123508557677269, Test Loss: 0.0649218037724495\n", "Epoch 4724/10000, Training Loss: 0.04122662916779518, Test Loss: 0.06492491066455841\n", "Epoch 4725/10000, Training Loss: 0.041218213737010956, Test Loss: 0.06492716073989868\n", "Epoch 4726/10000, Training Loss: 0.04120975360274315, Test Loss: 0.06493199616670609\n", "Epoch 4727/10000, Training Loss: 0.04120131582021713, Test Loss: 0.06493984907865524\n", "Epoch 4728/10000, Training Loss: 0.04119286686182022, Test Loss: 0.06494849920272827\n", "Epoch 4729/10000, Training Loss: 0.04118445888161659, Test Loss: 0.06495499610900879\n", "Epoch 4730/10000, Training Loss: 0.04117601364850998, Test Loss: 0.06496331840753555\n", "Epoch 4731/10000, Training Loss: 0.041167594492435455, Test Loss: 0.06497085839509964\n", "Epoch 4732/10000, Training Loss: 0.04115915670990944, Test Loss: 0.06497614830732346\n", "Epoch 4733/10000, Training Loss: 0.041150737553834915, Test Loss: 0.06497985869646072\n", "Epoch 4734/10000, Training Loss: 0.0411422923207283, Test Loss: 0.0649796798825264\n", "Epoch 4735/10000, Training Loss: 0.041133854538202286, Test Loss: 0.06498147547245026\n", "Epoch 4736/10000, Training Loss: 0.04112543910741806, Test Loss: 0.0649886429309845\n", "Epoch 4737/10000, Training Loss: 0.04111698642373085, Test Loss: 0.06499965488910675\n", "Epoch 4738/10000, Training Loss: 0.041108570992946625, Test Loss: 0.06501354277133942\n", "Epoch 4739/10000, Training Loss: 0.041100144386291504, Test Loss: 0.06502248346805573\n", "Epoch 4740/10000, Training Loss: 0.041091736406087875, Test Loss: 0.06501979380846024\n", "Epoch 4741/10000, Training Loss: 0.041083335876464844, Test Loss: 0.06501438468694687\n", "Epoch 4742/10000, Training Loss: 0.04107489064335823, Test Loss: 0.06502153724431992\n", "Epoch 4743/10000, Training Loss: 0.04106647148728371, Test Loss: 0.06503865867853165\n", "Epoch 4744/10000, Training Loss: 0.0410580188035965, Test Loss: 0.0650501474738121\n", "Epoch 4745/10000, Training Loss: 0.04104962199926376, Test Loss: 
0.06505120545625687\n", "Epoch 4746/10000, Training Loss: 0.04104122519493103, Test Loss: 0.06504746526479721\n", "Epoch 4747/10000, Training Loss: 0.041032757610082626, Test Loss: 0.06505240499973297\n", "Epoch 4748/10000, Training Loss: 0.04102437198162079, Test Loss: 0.06506621092557907\n", "Epoch 4749/10000, Training Loss: 0.041015975177288055, Test Loss: 0.06507646292448044\n", "Epoch 4750/10000, Training Loss: 0.04100758209824562, Test Loss: 0.06507940590381622\n", "Epoch 4751/10000, Training Loss: 0.04099918529391289, Test Loss: 0.06507854908704758\n", "Epoch 4752/10000, Training Loss: 0.04099074751138687, Test Loss: 0.06508483737707138\n", "Epoch 4753/10000, Training Loss: 0.040982335805892944, Test Loss: 0.06509717553853989\n", "Epoch 4754/10000, Training Loss: 0.04097395017743111, Test Loss: 0.06510470807552338\n", "Epoch 4755/10000, Training Loss: 0.04096551984548569, Test Loss: 0.06510657072067261\n", "Epoch 4756/10000, Training Loss: 0.040957096964120865, Test Loss: 0.06511127948760986\n", "Epoch 4757/10000, Training Loss: 0.04094872623682022, Test Loss: 0.0651157945394516\n", "Epoch 4758/10000, Training Loss: 0.0409403070807457, Test Loss: 0.06512540578842163\n", "Epoch 4759/10000, Training Loss: 0.04093192145228386, Test Loss: 0.06513220816850662\n", "Epoch 4760/10000, Training Loss: 0.040923502296209335, Test Loss: 0.06513545662164688\n", "Epoch 4761/10000, Training Loss: 0.0409151054918766, Test Loss: 0.06513833999633789\n", "Epoch 4762/10000, Training Loss: 0.04090670123696327, Test Loss: 0.06514798849821091\n", "Epoch 4763/10000, Training Loss: 0.04089832678437233, Test Loss: 0.06516025960445404\n", "Epoch 4764/10000, Training Loss: 0.04088990390300751, Test Loss: 0.06516493856906891\n", "Epoch 4765/10000, Training Loss: 0.040881551802158356, Test Loss: 0.06516404449939728\n", "Epoch 4766/10000, Training Loss: 0.040873147547245026, Test Loss: 0.06516866385936737\n", "Epoch 4767/10000, Training Loss: 0.040864747017621994, Test Loss: 0.06517971307039261\n", "Epoch 4768/10000, Training Loss: 0.04085637629032135, Test Loss: 0.06518778204917908\n", "Epoch 4769/10000, Training Loss: 0.040847983211278915, Test Loss: 0.06519129127264023\n", "Epoch 4770/10000, Training Loss: 0.04083957523107529, Test Loss: 0.0651930645108223\n", "Epoch 4771/10000, Training Loss: 0.040831197053194046, Test Loss: 0.06519749015569687\n", "Epoch 4772/10000, Training Loss: 0.04082280769944191, Test Loss: 0.06520979106426239\n", "Epoch 4773/10000, Training Loss: 0.040814436972141266, Test Loss: 0.06521919369697571\n", "Epoch 4774/10000, Training Loss: 0.04080604016780853, Test Loss: 0.06522715091705322\n", "Epoch 4775/10000, Training Loss: 0.04079766198992729, Test Loss: 0.06522783637046814\n", "Epoch 4776/10000, Training Loss: 0.04078926146030426, Test Loss: 0.06523130089044571\n", "Epoch 4777/10000, Training Loss: 0.04078090190887451, Test Loss: 0.0652356818318367\n", "Epoch 4778/10000, Training Loss: 0.040772534906864166, Test Loss: 0.06524257361888885\n", "Epoch 4779/10000, Training Loss: 0.04076412692666054, Test Loss: 0.06525503098964691\n", "Epoch 4780/10000, Training Loss: 0.04075578600168228, Test Loss: 0.06526236981153488\n", "Epoch 4781/10000, Training Loss: 0.040747396647930145, Test Loss: 0.06526188552379608\n", "Epoch 4782/10000, Training Loss: 0.04073904827237129, Test Loss: 0.06526459008455276\n", "Epoch 4783/10000, Training Loss: 0.04073066636919975, Test Loss: 0.06527362018823624\n", "Epoch 4784/10000, Training Loss: 0.0407223105430603, Test Loss: 0.06528167426586151\n", "Epoch 
4785/10000, Training Loss: 0.040713950991630554, Test Loss: 0.06528680771589279\n", "Epoch 4786/10000, Training Loss: 0.04070558398962021, Test Loss: 0.06529057770967484\n", "Epoch 4787/10000, Training Loss: 0.040697209537029266, Test Loss: 0.06529521197080612\n", "Epoch 4788/10000, Training Loss: 0.04068886861205101, Test Loss: 0.06530231982469559\n", "Epoch 4789/10000, Training Loss: 0.04068049415946007, Test Loss: 0.06531060487031937\n", "Epoch 4790/10000, Training Loss: 0.04067215695977211, Test Loss: 0.06531792134046555\n", "Epoch 4791/10000, Training Loss: 0.04066379368305206, Test Loss: 0.06532341986894608\n", "Epoch 4792/10000, Training Loss: 0.040655456483364105, Test Loss: 0.06533178687095642\n", "Epoch 4793/10000, Training Loss: 0.04064703732728958, Test Loss: 0.06534048914909363\n", "Epoch 4794/10000, Training Loss: 0.04063873738050461, Test Loss: 0.06534728407859802\n", "Epoch 4795/10000, Training Loss: 0.04063039645552635, Test Loss: 0.065347820520401\n", "Epoch 4796/10000, Training Loss: 0.0406220406293869, Test Loss: 0.06534838676452637\n", "Epoch 4797/10000, Training Loss: 0.04061370715498924, Test Loss: 0.06535429507493973\n", "Epoch 4798/10000, Training Loss: 0.0406053327023983, Test Loss: 0.06536554545164108\n", "Epoch 4799/10000, Training Loss: 0.04059699550271034, Test Loss: 0.06537659466266632\n", "Epoch 4800/10000, Training Loss: 0.04058867320418358, Test Loss: 0.06538276374340057\n", "Epoch 4801/10000, Training Loss: 0.040580328553915024, Test Loss: 0.065384641289711\n", "Epoch 4802/10000, Training Loss: 0.04057193920016289, Test Loss: 0.0653868317604065\n", "Epoch 4803/10000, Training Loss: 0.04056369140744209, Test Loss: 0.06539729237556458\n", "Epoch 4804/10000, Training Loss: 0.04055530205368996, Test Loss: 0.06541112065315247\n", "Epoch 4805/10000, Training Loss: 0.0405469685792923, Test Loss: 0.06541674584150314\n", "Epoch 4806/10000, Training Loss: 0.04053860157728195, Test Loss: 0.0654156282544136\n", "Epoch 4807/10000, Training Loss: 0.0405302569270134, Test Loss: 0.06541561335325241\n", "Epoch 4808/10000, Training Loss: 0.04052193835377693, Test Loss: 0.06542275100946426\n", "Epoch 4809/10000, Training Loss: 0.040513619780540466, Test Loss: 0.06543543189764023\n", "Epoch 4810/10000, Training Loss: 0.0405053049325943, Test Loss: 0.06545095145702362\n", "Epoch 4811/10000, Training Loss: 0.04049695283174515, Test Loss: 0.06545594334602356\n", "Epoch 4812/10000, Training Loss: 0.04048861190676689, Test Loss: 0.06545307487249374\n", "Epoch 4813/10000, Training Loss: 0.040480323135852814, Test Loss: 0.06545591354370117\n", "Epoch 4814/10000, Training Loss: 0.04047197848558426, Test Loss: 0.06546347588300705\n", "Epoch 4815/10000, Training Loss: 0.0404636524617672, Test Loss: 0.06547416001558304\n", "Epoch 4816/10000, Training Loss: 0.040455322712659836, Test Loss: 0.06548324227333069\n", "Epoch 4817/10000, Training Loss: 0.040446992963552475, Test Loss: 0.06548843532800674\n", "Epoch 4818/10000, Training Loss: 0.04043867066502571, Test Loss: 0.06549113243818283\n", "Epoch 4819/10000, Training Loss: 0.04043032228946686, Test Loss: 0.06549905985593796\n", "Epoch 4820/10000, Training Loss: 0.040422022342681885, Test Loss: 0.06551029533147812\n", "Epoch 4821/10000, Training Loss: 0.040413741022348404, Test Loss: 0.0655159130692482\n", "Epoch 4822/10000, Training Loss: 0.04040543735027313, Test Loss: 0.06551741063594818\n", "Epoch 4823/10000, Training Loss: 0.04039706662297249, Test Loss: 0.06551992148160934\n", "Epoch 4824/10000, Training Loss: 0.04038877785205841, 
Test Loss: 0.06552721560001373\n", "Epoch 4825/10000, Training Loss: 0.040380463004112244, Test Loss: 0.06553811579942703\n", "Epoch 4826/10000, Training Loss: 0.04037213325500488, Test Loss: 0.06554792821407318\n", "Epoch 4827/10000, Training Loss: 0.040363844484090805, Test Loss: 0.06555332988500595\n", "Epoch 4828/10000, Training Loss: 0.04035549983382225, Test Loss: 0.06555578112602234\n", "Epoch 4829/10000, Training Loss: 0.04034721851348877, Test Loss: 0.06556322425603867\n", "Epoch 4830/10000, Training Loss: 0.04033888876438141, Test Loss: 0.06557434052228928\n", "Epoch 4831/10000, Training Loss: 0.040330637246370316, Test Loss: 0.06558052450418472\n", "Epoch 4832/10000, Training Loss: 0.04032227769494057, Test Loss: 0.0655825138092041\n", "Epoch 4833/10000, Training Loss: 0.04031399264931679, Test Loss: 0.06558520346879959\n", "Epoch 4834/10000, Training Loss: 0.040305666625499725, Test Loss: 0.0655924379825592\n", "Epoch 4835/10000, Training Loss: 0.04029739275574684, Test Loss: 0.06560303270816803\n", "Epoch 4836/10000, Training Loss: 0.04028908908367157, Test Loss: 0.06561256945133209\n", "Epoch 4837/10000, Training Loss: 0.04028075560927391, Test Loss: 0.06561809033155441\n", "Epoch 4838/10000, Training Loss: 0.04027247428894043, Test Loss: 0.06562099605798721\n", "Epoch 4839/10000, Training Loss: 0.040264178067445755, Test Loss: 0.06562909483909607\n", "Epoch 4840/10000, Training Loss: 0.04025588184595108, Test Loss: 0.0656404048204422\n", "Epoch 4841/10000, Training Loss: 0.04024758189916611, Test Loss: 0.06564617156982422\n", "Epoch 4842/10000, Training Loss: 0.040239300578832626, Test Loss: 0.0656476616859436\n", "Epoch 4843/10000, Training Loss: 0.040230993181467056, Test Loss: 0.06565042585134506\n", "Epoch 4844/10000, Training Loss: 0.04022270068526268, Test Loss: 0.06565816700458527\n", "Epoch 4845/10000, Training Loss: 0.04021439701318741, Test Loss: 0.06566930562257767\n", "Epoch 4846/10000, Training Loss: 0.040206123143434525, Test Loss: 0.06567885726690292\n", "Epoch 4847/10000, Training Loss: 0.040197838097810745, Test Loss: 0.06568411737680435\n", "Epoch 4848/10000, Training Loss: 0.04018951579928398, Test Loss: 0.06568702310323715\n", "Epoch 4849/10000, Training Loss: 0.04018126428127289, Test Loss: 0.06569137424230576\n", "Epoch 4850/10000, Training Loss: 0.040172968059778214, Test Loss: 0.06570320576429367\n", "Epoch 4851/10000, Training Loss: 0.04016467183828354, Test Loss: 0.06571651995182037\n", "Epoch 4852/10000, Training Loss: 0.040156394243240356, Test Loss: 0.0657208263874054\n", "Epoch 4853/10000, Training Loss: 0.04014810174703598, Test Loss: 0.06571947783231735\n", "Epoch 4854/10000, Training Loss: 0.040139857679605484, Test Loss: 0.06572099775075912\n", "Epoch 4855/10000, Training Loss: 0.040131568908691406, Test Loss: 0.06573031842708588\n", "Epoch 4856/10000, Training Loss: 0.040123265236616135, Test Loss: 0.0657440572977066\n", "Epoch 4857/10000, Training Loss: 0.04011502116918564, Test Loss: 0.06575465947389603\n", "Epoch 4858/10000, Training Loss: 0.04010673612356186, Test Loss: 0.06575863808393478\n", "Epoch 4859/10000, Training Loss: 0.04009845107793808, Test Loss: 0.06575948745012283\n", "Epoch 4860/10000, Training Loss: 0.040090158581733704, Test Loss: 0.06576333194971085\n", "Epoch 4861/10000, Training Loss: 0.0400819331407547, Test Loss: 0.06577657908201218\n", "Epoch 4862/10000, Training Loss: 0.04007364809513092, Test Loss: 0.06578797847032547\n", "Epoch 4863/10000, Training Loss: 0.040065377950668335, Test Loss: 0.06579315662384033\n", 
"Epoch 4864/10000, Training Loss: 0.040057096630334854, Test Loss: 0.06579416990280151\n", "Epoch 4865/10000, Training Loss: 0.04004884883761406, Test Loss: 0.06579739600419998\n", "Epoch 4866/10000, Training Loss: 0.04004056006669998, Test Loss: 0.06580964475870132\n", "Epoch 4867/10000, Training Loss: 0.04003230482339859, Test Loss: 0.06582098454236984\n", "Epoch 4868/10000, Training Loss: 0.04002401605248451, Test Loss: 0.06582679599523544\n", "Epoch 4869/10000, Training Loss: 0.04001575708389282, Test Loss: 0.06582886725664139\n", "Epoch 4870/10000, Training Loss: 0.040007516741752625, Test Loss: 0.06583244353532791\n", "Epoch 4871/10000, Training Loss: 0.03999927267432213, Test Loss: 0.06584062427282333\n", "Epoch 4872/10000, Training Loss: 0.03999098017811775, Test Loss: 0.06585150212049484\n", "Epoch 4873/10000, Training Loss: 0.03998275846242905, Test Loss: 0.06586075574159622\n", "Epoch 4874/10000, Training Loss: 0.03997446596622467, Test Loss: 0.0658661425113678\n", "Epoch 4875/10000, Training Loss: 0.039966195821762085, Test Loss: 0.06587308645248413\n", "Epoch 4876/10000, Training Loss: 0.03995797410607338, Test Loss: 0.06588131934404373\n", "Epoch 4877/10000, Training Loss: 0.0399496853351593, Test Loss: 0.0658857449889183\n", "Epoch 4878/10000, Training Loss: 0.039941467344760895, Test Loss: 0.06588934361934662\n", "Epoch 4879/10000, Training Loss: 0.03993317857384682, Test Loss: 0.06589551270008087\n", "Epoch 4880/10000, Training Loss: 0.03992494195699692, Test Loss: 0.06590479612350464\n", "Epoch 4881/10000, Training Loss: 0.03991669788956642, Test Loss: 0.06591453403234482\n", "Epoch 4882/10000, Training Loss: 0.03990846127271652, Test Loss: 0.06592174619436264\n", "Epoch 4883/10000, Training Loss: 0.03990023210644722, Test Loss: 0.06592649966478348\n", "Epoch 4884/10000, Training Loss: 0.03989197313785553, Test Loss: 0.06593126058578491\n", "Epoch 4885/10000, Training Loss: 0.03988371416926384, Test Loss: 0.06593815982341766\n", "Epoch 4886/10000, Training Loss: 0.03987543657422066, Test Loss: 0.06594717502593994\n", "Epoch 4887/10000, Training Loss: 0.039867233484983444, Test Loss: 0.06595584005117416\n", "Epoch 4888/10000, Training Loss: 0.03985900059342384, Test Loss: 0.06596259772777557\n", "Epoch 4889/10000, Training Loss: 0.039850737899541855, Test Loss: 0.06596772372722626\n", "Epoch 4890/10000, Training Loss: 0.03984250873327255, Test Loss: 0.06597333401441574\n", "Epoch 4891/10000, Training Loss: 0.03983427584171295, Test Loss: 0.06598058342933655\n", "Epoch 4892/10000, Training Loss: 0.03982599824666977, Test Loss: 0.06599263846874237\n", "Epoch 4893/10000, Training Loss: 0.03981780260801315, Test Loss: 0.06600040942430496\n", "Epoch 4894/10000, Training Loss: 0.03980957344174385, Test Loss: 0.06600355356931686\n", "Epoch 4895/10000, Training Loss: 0.03980131447315216, Test Loss: 0.06600654870271683\n", "Epoch 4896/10000, Training Loss: 0.039793118834495544, Test Loss: 0.06601357460021973\n", "Epoch 4897/10000, Training Loss: 0.039784885942935944, Test Loss: 0.06602422893047333\n", "Epoch 4898/10000, Training Loss: 0.03977663442492485, Test Loss: 0.06603442132472992\n", "Epoch 4899/10000, Training Loss: 0.03976843133568764, Test Loss: 0.0660443976521492\n", "Epoch 4900/10000, Training Loss: 0.03976016864180565, Test Loss: 0.06604767590761185\n", "Epoch 4901/10000, Training Loss: 0.03975195437669754, Test Loss: 0.06604887545108795\n", "Epoch 4902/10000, Training Loss: 0.03974373638629913, Test Loss: 0.06605446338653564\n", "Epoch 4903/10000, Training Loss: 
0.039735496044158936, Test Loss: 0.06606534123420715\n", "Epoch 4904/10000, Training Loss: 0.039727259427309036, Test Loss: 0.06607715785503387\n", "Epoch 4905/10000, Training Loss: 0.03971906006336212, Test Loss: 0.0660848394036293\n", "Epoch 4906/10000, Training Loss: 0.039710815995931625, Test Loss: 0.06608835607767105\n", "Epoch 4907/10000, Training Loss: 0.039702631533145905, Test Loss: 0.0660918727517128\n", "Epoch 4908/10000, Training Loss: 0.03969438374042511, Test Loss: 0.06609904021024704\n", "Epoch 4909/10000, Training Loss: 0.039686158299446106, Test Loss: 0.06610970199108124\n", "Epoch 4910/10000, Training Loss: 0.03967797011137009, Test Loss: 0.06611965596675873\n", "Epoch 4911/10000, Training Loss: 0.03966975584626198, Test Loss: 0.06612610071897507\n", "Epoch 4912/10000, Training Loss: 0.03966153413057327, Test Loss: 0.06612998992204666\n", "Epoch 4913/10000, Training Loss: 0.039653316140174866, Test Loss: 0.06613858044147491\n", "Epoch 4914/10000, Training Loss: 0.03964509442448616, Test Loss: 0.06614653021097183\n", "Epoch 4915/10000, Training Loss: 0.03963689133524895, Test Loss: 0.06615302711725235\n", "Epoch 4916/10000, Training Loss: 0.039628662168979645, Test Loss: 0.06615888327360153\n", "Epoch 4917/10000, Training Loss: 0.039620473980903625, Test Loss: 0.06616561114788055\n", "Epoch 4918/10000, Training Loss: 0.0396122932434082, Test Loss: 0.06617357581853867\n", "Epoch 4919/10000, Training Loss: 0.0396040603518486, Test Loss: 0.06618203967809677\n", "Epoch 4920/10000, Training Loss: 0.039595894515514374, Test Loss: 0.06618964672088623\n", "Epoch 4921/10000, Training Loss: 0.039587680250406265, Test Loss: 0.06619613617658615\n", "Epoch 4922/10000, Training Loss: 0.03957943245768547, Test Loss: 0.06620234251022339\n", "Epoch 4923/10000, Training Loss: 0.039571259170770645, Test Loss: 0.06620937585830688\n", "Epoch 4924/10000, Training Loss: 0.039563052356243134, Test Loss: 0.06621745228767395\n", "Epoch 4925/10000, Training Loss: 0.03955487534403801, Test Loss: 0.06622543185949326\n", "Epoch 4926/10000, Training Loss: 0.039546698331832886, Test Loss: 0.06623262912034988\n", "Epoch 4927/10000, Training Loss: 0.03953845426440239, Test Loss: 0.06623920798301697\n", "Epoch 4928/10000, Training Loss: 0.039530280977487564, Test Loss: 0.06624942272901535\n", "Epoch 4929/10000, Training Loss: 0.03952209651470184, Test Loss: 0.06625644117593765\n", "Epoch 4930/10000, Training Loss: 0.03951390087604523, Test Loss: 0.066260926425457\n", "Epoch 4931/10000, Training Loss: 0.03950570523738861, Test Loss: 0.06626606732606888\n", "Epoch 4932/10000, Training Loss: 0.03949747979640961, Test Loss: 0.06627418845891953\n", "Epoch 4933/10000, Training Loss: 0.03948928788304329, Test Loss: 0.06628423184156418\n", "Epoch 4934/10000, Training Loss: 0.039481133222579956, Test Loss: 0.06629305332899094\n", "Epoch 4935/10000, Training Loss: 0.03947294130921364, Test Loss: 0.06629925221204758\n", "Epoch 4936/10000, Training Loss: 0.03946472331881523, Test Loss: 0.06630422919988632\n", "Epoch 4937/10000, Training Loss: 0.03945652395486832, Test Loss: 0.06631073355674744\n", "Epoch 4938/10000, Training Loss: 0.03944837301969528, Test Loss: 0.06631936877965927\n", "Epoch 4939/10000, Training Loss: 0.03944019228219986, Test Loss: 0.0663287565112114\n", "Epoch 4940/10000, Training Loss: 0.039432015269994736, Test Loss: 0.0663367286324501\n", "Epoch 4941/10000, Training Loss: 0.03942381590604782, Test Loss: 0.06634273380041122\n", "Epoch 4942/10000, Training Loss: 0.039415620267391205, Test Loss: 
0.06634857505559921\n", "Epoch 4943/10000, Training Loss: 0.039407506585121155, Test Loss: 0.06635575741529465\n", "Epoch 4944/10000, Training Loss: 0.03939928859472275, Test Loss: 0.06636454910039902\n", "Epoch 4945/10000, Training Loss: 0.039391107857227325, Test Loss: 0.06637318432331085\n", "Epoch 4946/10000, Training Loss: 0.03938296437263489, Test Loss: 0.06638043373823166\n", "Epoch 4947/10000, Training Loss: 0.039374783635139465, Test Loss: 0.06638672202825546\n", "Epoch 4948/10000, Training Loss: 0.03936661034822464, Test Loss: 0.06639331579208374\n", "Epoch 4949/10000, Training Loss: 0.03935845196247101, Test Loss: 0.06640113890171051\n", "Epoch 4950/10000, Training Loss: 0.0393502451479435, Test Loss: 0.06640958040952682\n", "Epoch 4951/10000, Training Loss: 0.03934209421277046, Test Loss: 0.0664176270365715\n", "Epoch 4952/10000, Training Loss: 0.03933393955230713, Test Loss: 0.06642478704452515\n", "Epoch 4953/10000, Training Loss: 0.03932574763894081, Test Loss: 0.06643140316009521\n", "Epoch 4954/10000, Training Loss: 0.039317645132541656, Test Loss: 0.06643843650817871\n", "Epoch 4955/10000, Training Loss: 0.03930944576859474, Test Loss: 0.06644636392593384\n", "Epoch 4956/10000, Training Loss: 0.03930129110813141, Test Loss: 0.06645460426807404\n", "Epoch 4957/10000, Training Loss: 0.039293140172958374, Test Loss: 0.0664624348282814\n", "Epoch 4958/10000, Training Loss: 0.039284948259592056, Test Loss: 0.06646960228681564\n", "Epoch 4959/10000, Training Loss: 0.03927680850028992, Test Loss: 0.06647662818431854\n", "Epoch 4960/10000, Training Loss: 0.039268601685762405, Test Loss: 0.06648390740156174\n", "Epoch 4961/10000, Training Loss: 0.039260461926460266, Test Loss: 0.06649187207221985\n", "Epoch 4962/10000, Training Loss: 0.03925231844186783, Test Loss: 0.06649985909461975\n", "Epoch 4963/10000, Training Loss: 0.0392441526055336, Test Loss: 0.06650754809379578\n", "Epoch 4964/10000, Training Loss: 0.03923601284623146, Test Loss: 0.06651477515697479\n", "Epoch 4965/10000, Training Loss: 0.03922782465815544, Test Loss: 0.0665220320224762\n", "Epoch 4966/10000, Training Loss: 0.03921970725059509, Test Loss: 0.0665297582745552\n", "Epoch 4967/10000, Training Loss: 0.03921154513955116, Test Loss: 0.06653761863708496\n", "Epoch 4968/10000, Training Loss: 0.03920339420437813, Test Loss: 0.06654533743858337\n", "Epoch 4969/10000, Training Loss: 0.0391952320933342, Test Loss: 0.06655281782150269\n", "Epoch 4970/10000, Training Loss: 0.03918709605932236, Test Loss: 0.066560298204422\n", "Epoch 4971/10000, Training Loss: 0.03917896747589111, Test Loss: 0.06656788289546967\n", "Epoch 4972/10000, Training Loss: 0.039170823991298676, Test Loss: 0.06657560914754868\n", "Epoch 4973/10000, Training Loss: 0.039162661880254745, Test Loss: 0.0665833055973053\n", "Epoch 4974/10000, Training Loss: 0.0391545332968235, Test Loss: 0.06659098714590073\n", "Epoch 4975/10000, Training Loss: 0.03914637863636017, Test Loss: 0.06659857928752899\n", "Epoch 4976/10000, Training Loss: 0.03913826867938042, Test Loss: 0.06660637259483337\n", "Epoch 4977/10000, Training Loss: 0.03913012519478798, Test Loss: 0.06661418825387955\n", "Epoch 4978/10000, Training Loss: 0.03912196308374405, Test Loss: 0.06662195920944214\n", "Epoch 4979/10000, Training Loss: 0.03911381587386131, Test Loss: 0.06662961095571518\n", "Epoch 4980/10000, Training Loss: 0.03910567983984947, Test Loss: 0.06663729250431061\n", "Epoch 4981/10000, Training Loss: 0.039097536355257034, Test Loss: 0.0666450783610344\n", "Epoch 4982/10000, 
Training Loss: 0.03908941522240639, Test Loss: 0.06665276736021042\n", "Epoch 4983/10000, Training Loss: 0.039081282913684845, Test Loss: 0.06666035205125809\n", "Epoch 4984/10000, Training Loss: 0.0390731580555439, Test Loss: 0.06666820496320724\n", "Epoch 4985/10000, Training Loss: 0.03906501457095146, Test Loss: 0.06667612493038177\n", "Epoch 4986/10000, Training Loss: 0.03905687853693962, Test Loss: 0.06668391823768616\n", "Epoch 4987/10000, Training Loss: 0.03904873877763748, Test Loss: 0.0666913315653801\n", "Epoch 4988/10000, Training Loss: 0.03904065117239952, Test Loss: 0.06669900566339493\n", "Epoch 4989/10000, Training Loss: 0.03903255611658096, Test Loss: 0.0667070522904396\n", "Epoch 4990/10000, Training Loss: 0.039024386554956436, Test Loss: 0.0667152851819992\n", "Epoch 4991/10000, Training Loss: 0.03901629149913788, Test Loss: 0.06672301888465881\n", "Epoch 4992/10000, Training Loss: 0.039008140563964844, Test Loss: 0.0667305737733841\n", "Epoch 4993/10000, Training Loss: 0.03900005668401718, Test Loss: 0.06673812121152878\n", "Epoch 4994/10000, Training Loss: 0.03899190202355385, Test Loss: 0.06674613803625107\n", "Epoch 4995/10000, Training Loss: 0.03898381441831589, Test Loss: 0.06675419211387634\n", "Epoch 4996/10000, Training Loss: 0.03897567093372345, Test Loss: 0.06676221638917923\n", "Epoch 4997/10000, Training Loss: 0.03896757960319519, Test Loss: 0.06677000224590302\n", "Epoch 4998/10000, Training Loss: 0.03895946592092514, Test Loss: 0.06677751988172531\n", "Epoch 4999/10000, Training Loss: 0.0389513336122036, Test Loss: 0.06678510457277298\n", "Epoch 5000/10000, Training Loss: 0.03894323110580444, Test Loss: 0.06679319590330124\n", "Epoch 5001/10000, Training Loss: 0.03893512114882469, Test Loss: 0.06680157780647278\n", "Epoch 5002/10000, Training Loss: 0.038926951587200165, Test Loss: 0.06680969148874283\n", "Epoch 5003/10000, Training Loss: 0.03891889005899429, Test Loss: 0.06681743264198303\n", "Epoch 5004/10000, Training Loss: 0.038910768926143646, Test Loss: 0.06682494282722473\n", "Epoch 5005/10000, Training Loss: 0.038902707397937775, Test Loss: 0.06683269888162613\n", "Epoch 5006/10000, Training Loss: 0.03889456391334534, Test Loss: 0.06684082746505737\n", "Epoch 5007/10000, Training Loss: 0.038886453956365585, Test Loss: 0.06684926897287369\n", "Epoch 5008/10000, Training Loss: 0.03887838497757912, Test Loss: 0.06685739010572433\n", "Epoch 5009/10000, Training Loss: 0.038870248943567276, Test Loss: 0.06686501204967499\n", "Epoch 5010/10000, Training Loss: 0.03886215388774872, Test Loss: 0.06687244772911072\n", "Epoch 5011/10000, Training Loss: 0.03885403648018837, Test Loss: 0.06688021868467331\n", "Epoch 5012/10000, Training Loss: 0.0388459786772728, Test Loss: 0.06688889116048813\n", "Epoch 5013/10000, Training Loss: 0.03883783146739006, Test Loss: 0.0668974220752716\n", "Epoch 5014/10000, Training Loss: 0.03882976621389389, Test Loss: 0.06690521538257599\n", "Epoch 5015/10000, Training Loss: 0.03882165253162384, Test Loss: 0.06691271811723709\n", "Epoch 5016/10000, Training Loss: 0.038813553750514984, Test Loss: 0.06692017614841461\n", "Epoch 5017/10000, Training Loss: 0.03880547732114792, Test Loss: 0.06692858040332794\n", "Epoch 5018/10000, Training Loss: 0.03879740834236145, Test Loss: 0.06693731248378754\n", "Epoch 5019/10000, Training Loss: 0.038789305835962296, Test Loss: 0.06694568693637848\n", "Epoch 5020/10000, Training Loss: 0.03878118097782135, Test Loss: 0.06695307791233063\n", "Epoch 5021/10000, Training Loss: 0.038773100823163986, Test 
Loss: 0.06696057319641113\n", "Epoch 5022/10000, Training Loss: 0.03876502066850662, Test Loss: 0.06696869432926178\n", "Epoch 5023/10000, Training Loss: 0.03875691071152687, Test Loss: 0.06697741150856018\n", "Epoch 5024/10000, Training Loss: 0.0387488454580307, Test Loss: 0.06698582321405411\n", "Epoch 5025/10000, Training Loss: 0.03874078020453453, Test Loss: 0.06699356436729431\n", "Epoch 5026/10000, Training Loss: 0.03873267397284508, Test Loss: 0.06700132042169571\n", "Epoch 5027/10000, Training Loss: 0.038724590092897415, Test Loss: 0.06700940430164337\n", "Epoch 5028/10000, Training Loss: 0.03871650993824005, Test Loss: 0.06701777130365372\n", "Epoch 5029/10000, Training Loss: 0.038708433508872986, Test Loss: 0.06702610850334167\n", "Epoch 5030/10000, Training Loss: 0.03870036080479622, Test Loss: 0.06703402101993561\n", "Epoch 5031/10000, Training Loss: 0.03869229555130005, Test Loss: 0.06704211980104446\n", "Epoch 5032/10000, Training Loss: 0.038684189319610596, Test Loss: 0.06705024093389511\n", "Epoch 5033/10000, Training Loss: 0.03867613896727562, Test Loss: 0.06705845147371292\n", "Epoch 5034/10000, Training Loss: 0.03866806998848915, Test Loss: 0.06706688553094864\n", "Epoch 5035/10000, Training Loss: 0.03866000100970268, Test Loss: 0.06707489490509033\n", "Epoch 5036/10000, Training Loss: 0.03865190967917442, Test Loss: 0.06708298623561859\n", "Epoch 5037/10000, Training Loss: 0.03864385187625885, Test Loss: 0.06709108501672745\n", "Epoch 5038/10000, Training Loss: 0.03863578289747238, Test Loss: 0.06709937751293182\n", "Epoch 5039/10000, Training Loss: 0.03862771391868591, Test Loss: 0.06710764020681381\n", "Epoch 5040/10000, Training Loss: 0.03861963748931885, Test Loss: 0.06711579114198685\n", "Epoch 5041/10000, Training Loss: 0.03861158713698387, Test Loss: 0.06712403893470764\n", "Epoch 5042/10000, Training Loss: 0.038603488355875015, Test Loss: 0.06713223457336426\n", "Epoch 5043/10000, Training Loss: 0.03859545662999153, Test Loss: 0.0671406164765358\n", "Epoch 5044/10000, Training Loss: 0.038587380200624466, Test Loss: 0.06714887917041779\n", "Epoch 5045/10000, Training Loss: 0.038579341024160385, Test Loss: 0.06715705990791321\n", "Epoch 5046/10000, Training Loss: 0.03857126832008362, Test Loss: 0.06716509163379669\n", "Epoch 5047/10000, Training Loss: 0.03856319189071655, Test Loss: 0.06717338413000107\n", "Epoch 5048/10000, Training Loss: 0.03855515643954277, Test Loss: 0.06718192249536514\n", "Epoch 5049/10000, Training Loss: 0.03854711726307869, Test Loss: 0.06719040125608444\n", "Epoch 5050/10000, Training Loss: 0.03853902965784073, Test Loss: 0.06719846278429031\n", "Epoch 5051/10000, Training Loss: 0.038530971854925156, Test Loss: 0.06720653176307678\n", "Epoch 5052/10000, Training Loss: 0.038522910326719284, Test Loss: 0.06721492856740952\n", "Epoch 5053/10000, Training Loss: 0.038514863699674606, Test Loss: 0.06722342222929001\n", "Epoch 5054/10000, Training Loss: 0.038506798446178436, Test Loss: 0.06723194569349289\n", "Epoch 5055/10000, Training Loss: 0.038498781621456146, Test Loss: 0.06724005192518234\n", "Epoch 5056/10000, Training Loss: 0.03849071264266968, Test Loss: 0.06724803894758224\n", "Epoch 5057/10000, Training Loss: 0.038482666015625, Test Loss: 0.06725657731294632\n", "Epoch 5058/10000, Training Loss: 0.038474638015031815, Test Loss: 0.06726527959108353\n", "Epoch 5059/10000, Training Loss: 0.038466572761535645, Test Loss: 0.067273810505867\n", "Epoch 5060/10000, Training Loss: 0.03845852240920067, Test Loss: 0.06728199124336243\n", "Epoch 
5061/10000, Training Loss: 0.038450516760349274, Test Loss: 0.06729000061750412\n", "Epoch 5062/10000, Training Loss: 0.038442470133304596, Test Loss: 0.06729830056428909\n", "Epoch 5063/10000, Training Loss: 0.03843442723155022, Test Loss: 0.0673069953918457\n", "Epoch 5064/10000, Training Loss: 0.03842638060450554, Test Loss: 0.06731570512056351\n", "Epoch 5065/10000, Training Loss: 0.03841831535100937, Test Loss: 0.06732407957315445\n", "Epoch 5066/10000, Training Loss: 0.038410309702157974, Test Loss: 0.06733199208974838\n", "Epoch 5067/10000, Training Loss: 0.038402263075113297, Test Loss: 0.06734021008014679\n", "Epoch 5068/10000, Training Loss: 0.038394246250391006, Test Loss: 0.06734910607337952\n", "Epoch 5069/10000, Training Loss: 0.038386180996894836, Test Loss: 0.06735799461603165\n", "Epoch 5070/10000, Training Loss: 0.03837818279862404, Test Loss: 0.06736639142036438\n", "Epoch 5071/10000, Training Loss: 0.03837011381983757, Test Loss: 0.06737449765205383\n", "Epoch 5072/10000, Training Loss: 0.038362082093954086, Test Loss: 0.0673827975988388\n", "Epoch 5073/10000, Training Loss: 0.03835407644510269, Test Loss: 0.06739139556884766\n", "Epoch 5074/10000, Training Loss: 0.03834601864218712, Test Loss: 0.0674000158905983\n", "Epoch 5075/10000, Training Loss: 0.03833800181746483, Test Loss: 0.06740858405828476\n", "Epoch 5076/10000, Training Loss: 0.03832997754216194, Test Loss: 0.06741712987422943\n", "Epoch 5077/10000, Training Loss: 0.03832194581627846, Test Loss: 0.06742535531520844\n", "Epoch 5078/10000, Training Loss: 0.03831389546394348, Test Loss: 0.06743375957012177\n", "Epoch 5079/10000, Training Loss: 0.03830590099096298, Test Loss: 0.06744256615638733\n", "Epoch 5080/10000, Training Loss: 0.0382978580892086, Test Loss: 0.06745132058858871\n", "Epoch 5081/10000, Training Loss: 0.038289859890937805, Test Loss: 0.06745973229408264\n", "Epoch 5082/10000, Training Loss: 0.03828185051679611, Test Loss: 0.0674680694937706\n", "Epoch 5083/10000, Training Loss: 0.03827383369207382, Test Loss: 0.06747660785913467\n", "Epoch 5084/10000, Training Loss: 0.03826580196619034, Test Loss: 0.06748516112565994\n", "Epoch 5085/10000, Training Loss: 0.038257818669080734, Test Loss: 0.06749393045902252\n", "Epoch 5086/10000, Training Loss: 0.03824978321790695, Test Loss: 0.06750261038541794\n", "Epoch 5087/10000, Training Loss: 0.03824176639318466, Test Loss: 0.06751125305891037\n", "Epoch 5088/10000, Training Loss: 0.03823377192020416, Test Loss: 0.06751968711614609\n", "Epoch 5089/10000, Training Loss: 0.038225747644901276, Test Loss: 0.06752824038267136\n", "Epoch 5090/10000, Training Loss: 0.038217779248952866, Test Loss: 0.0675368532538414\n", "Epoch 5091/10000, Training Loss: 0.03820974379777908, Test Loss: 0.0675455704331398\n", "Epoch 5092/10000, Training Loss: 0.038201723247766495, Test Loss: 0.06755422055721283\n", "Epoch 5093/10000, Training Loss: 0.038193684071302414, Test Loss: 0.06756290793418884\n", "Epoch 5094/10000, Training Loss: 0.03818570449948311, Test Loss: 0.0675714984536171\n", "Epoch 5095/10000, Training Loss: 0.03817768394947052, Test Loss: 0.06757993996143341\n", "Epoch 5096/10000, Training Loss: 0.03816972300410271, Test Loss: 0.06758849322795868\n", "Epoch 5097/10000, Training Loss: 0.03816169500350952, Test Loss: 0.06759744882583618\n", "Epoch 5098/10000, Training Loss: 0.038153719156980515, Test Loss: 0.06760648638010025\n", "Epoch 5099/10000, Training Loss: 0.03814569488167763, Test Loss: 0.06761519610881805\n", "Epoch 5100/10000, Training Loss: 
0.038137681782245636, Test Loss: 0.06762346625328064\n", "Epoch 5101/10000, Training Loss: 0.03812973201274872, Test Loss: 0.06763199716806412\n", "Epoch 5102/10000, Training Loss: 0.03812171891331673, Test Loss: 0.06764090061187744\n", "Epoch 5103/10000, Training Loss: 0.03811372444033623, Test Loss: 0.06764987111091614\n", "Epoch 5104/10000, Training Loss: 0.038105692714452744, Test Loss: 0.06765870004892349\n", "Epoch 5105/10000, Training Loss: 0.03809773921966553, Test Loss: 0.06766709685325623\n", "Epoch 5106/10000, Training Loss: 0.038089726120233536, Test Loss: 0.06767555326223373\n", "Epoch 5107/10000, Training Loss: 0.038081757724285126, Test Loss: 0.06768453866243362\n", "Epoch 5108/10000, Training Loss: 0.03807377070188522, Test Loss: 0.06769359856843948\n", "Epoch 5109/10000, Training Loss: 0.03806576877832413, Test Loss: 0.06770255416631699\n", "Epoch 5110/10000, Training Loss: 0.03805779293179512, Test Loss: 0.0677110031247139\n", "Epoch 5111/10000, Training Loss: 0.03804983198642731, Test Loss: 0.06771968305110931\n", "Epoch 5112/10000, Training Loss: 0.038041818886995316, Test Loss: 0.0677284225821495\n", "Epoch 5113/10000, Training Loss: 0.03803384676575661, Test Loss: 0.06773748993873596\n", "Epoch 5114/10000, Training Loss: 0.03802584484219551, Test Loss: 0.06774526089429855\n", "Epoch 5115/10000, Training Loss: 0.0380178727209568, Test Loss: 0.067753367125988\n", "Epoch 5116/10000, Training Loss: 0.038009900599718094, Test Loss: 0.0677621141076088\n", "Epoch 5117/10000, Training Loss: 0.03800190985202789, Test Loss: 0.06777189671993256\n", "Epoch 5118/10000, Training Loss: 0.03799391910433769, Test Loss: 0.0677812248468399\n", "Epoch 5119/10000, Training Loss: 0.03798595815896988, Test Loss: 0.06778979301452637\n", "Epoch 5120/10000, Training Loss: 0.03797799348831177, Test Loss: 0.06779773533344269\n", "Epoch 5121/10000, Training Loss: 0.037970006465911865, Test Loss: 0.06780655682086945\n", "Epoch 5122/10000, Training Loss: 0.037962041795253754, Test Loss: 0.06781592220067978\n", "Epoch 5123/10000, Training Loss: 0.03795408084988594, Test Loss: 0.06782526522874832\n", "Epoch 5124/10000, Training Loss: 0.037946127355098724, Test Loss: 0.06783400475978851\n", "Epoch 5125/10000, Training Loss: 0.03793814778327942, Test Loss: 0.06784230470657349\n", "Epoch 5126/10000, Training Loss: 0.037930216640233994, Test Loss: 0.06785086542367935\n", "Epoch 5127/10000, Training Loss: 0.0379222109913826, Test Loss: 0.06786009669303894\n", "Epoch 5128/10000, Training Loss: 0.0379142202436924, Test Loss: 0.06786944717168808\n", "Epoch 5129/10000, Training Loss: 0.03790626674890518, Test Loss: 0.06787849962711334\n", "Epoch 5130/10000, Training Loss: 0.037898313254117966, Test Loss: 0.06788713485002518\n", "Epoch 5131/10000, Training Loss: 0.03789038211107254, Test Loss: 0.06789575517177582\n", "Epoch 5132/10000, Training Loss: 0.03788242116570473, Test Loss: 0.06790462136268616\n", "Epoch 5133/10000, Training Loss: 0.037874460220336914, Test Loss: 0.06791379302740097\n", "Epoch 5134/10000, Training Loss: 0.037866488099098206, Test Loss: 0.06792307645082474\n", "Epoch 5135/10000, Training Loss: 0.037858542054891586, Test Loss: 0.06793186813592911\n", "Epoch 5136/10000, Training Loss: 0.03785056993365288, Test Loss: 0.06794047355651855\n", "Epoch 5137/10000, Training Loss: 0.03784262388944626, Test Loss: 0.06794928014278412\n", "Epoch 5138/10000, Training Loss: 0.03783467784523964, Test Loss: 0.06795841455459595\n", "Epoch 5139/10000, Training Loss: 0.03782673925161362, Test Loss: 
0.06796763092279434\n", "Epoch 5140/10000, Training Loss: 0.037818796932697296, Test Loss: 0.06797681748867035\n", "Epoch 5141/10000, Training Loss: 0.037810832262039185, Test Loss: 0.06798577308654785\n", "Epoch 5142/10000, Training Loss: 0.03780288249254227, Test Loss: 0.06799439340829849\n", "Epoch 5143/10000, Training Loss: 0.03779494762420654, Test Loss: 0.06800329685211182\n", "Epoch 5144/10000, Training Loss: 0.037786971777677536, Test Loss: 0.0680125281214714\n", "Epoch 5145/10000, Training Loss: 0.037779007107019424, Test Loss: 0.06802182644605637\n", "Epoch 5146/10000, Training Loss: 0.037771075963974, Test Loss: 0.06803087145090103\n", "Epoch 5147/10000, Training Loss: 0.03776312991976738, Test Loss: 0.06803964823484421\n", "Epoch 5148/10000, Training Loss: 0.03775523602962494, Test Loss: 0.06804844737052917\n", "Epoch 5149/10000, Training Loss: 0.03774726018309593, Test Loss: 0.06805745512247086\n", "Epoch 5150/10000, Training Loss: 0.03773936629295349, Test Loss: 0.06806697696447372\n", "Epoch 5151/10000, Training Loss: 0.037731368094682693, Test Loss: 0.06807621568441391\n", "Epoch 5152/10000, Training Loss: 0.037723466753959656, Test Loss: 0.0680851861834526\n", "Epoch 5153/10000, Training Loss: 0.037715502083301544, Test Loss: 0.06809405237436295\n", "Epoch 5154/10000, Training Loss: 0.03770759329199791, Test Loss: 0.0681031346321106\n", "Epoch 5155/10000, Training Loss: 0.03769966587424278, Test Loss: 0.06811244040727615\n", "Epoch 5156/10000, Training Loss: 0.037691738456487656, Test Loss: 0.06812159717082977\n", "Epoch 5157/10000, Training Loss: 0.03768381103873253, Test Loss: 0.06812774389982224\n", "Epoch 5158/10000, Training Loss: 0.037675850093364716, Test Loss: 0.06813493371009827\n", "Epoch 5159/10000, Training Loss: 0.037667952477931976, Test Loss: 0.06814520806074142\n", "Epoch 5160/10000, Training Loss: 0.03766000270843506, Test Loss: 0.06815686076879501\n", "Epoch 5161/10000, Training Loss: 0.037652093917131424, Test Loss: 0.06816696375608444\n", "Epoch 5162/10000, Training Loss: 0.03764420747756958, Test Loss: 0.06817423552274704\n", "Epoch 5163/10000, Training Loss: 0.03763626888394356, Test Loss: 0.06818149238824844\n", "Epoch 5164/10000, Training Loss: 0.03762838616967201, Test Loss: 0.0681910589337349\n", "Epoch 5165/10000, Training Loss: 0.0376204214990139, Test Loss: 0.06820215284824371\n", "Epoch 5166/10000, Training Loss: 0.037612542510032654, Test Loss: 0.06821219623088837\n", "Epoch 5167/10000, Training Loss: 0.037604622542858124, Test Loss: 0.06822029501199722\n", "Epoch 5168/10000, Training Loss: 0.0375966839492321, Test Loss: 0.06822779774665833\n", "Epoch 5169/10000, Training Loss: 0.03758879750967026, Test Loss: 0.06823677569627762\n", "Epoch 5170/10000, Training Loss: 0.03758085519075394, Test Loss: 0.06824760884046555\n", "Epoch 5171/10000, Training Loss: 0.0375729538500309, Test Loss: 0.06825799494981766\n", "Epoch 5172/10000, Training Loss: 0.03756508603692055, Test Loss: 0.06826652586460114\n", "Epoch 5173/10000, Training Loss: 0.037557151168584824, Test Loss: 0.06827427446842194\n", "Epoch 5174/10000, Training Loss: 0.03754926472902298, Test Loss: 0.06828323006629944\n", "Epoch 5175/10000, Training Loss: 0.037541333585977554, Test Loss: 0.06829341500997543\n", "Epoch 5176/10000, Training Loss: 0.037533439695835114, Test Loss: 0.06830360740423203\n", "Epoch 5177/10000, Training Loss: 0.03752553462982178, Test Loss: 0.06831258535385132\n", "Epoch 5178/10000, Training Loss: 0.03751768544316292, Test Loss: 0.06832075864076614\n", "Epoch 
5179/10000, Training Loss: 0.037509750574827194, Test Loss: 0.06832946836948395\n", "Epoch 5180/10000, Training Loss: 0.03750188648700714, Test Loss: 0.06833957135677338\n", "Epoch 5181/10000, Training Loss: 0.037493959069252014, Test Loss: 0.06834986060857773\n", "Epoch 5182/10000, Training Loss: 0.03748607635498047, Test Loss: 0.0683588758111\n", "Epoch 5183/10000, Training Loss: 0.03747817128896713, Test Loss: 0.0683671236038208\n", "Epoch 5184/10000, Training Loss: 0.037470266222953796, Test Loss: 0.06837629526853561\n", "Epoch 5185/10000, Training Loss: 0.03746236860752106, Test Loss: 0.06838624179363251\n", "Epoch 5186/10000, Training Loss: 0.03745449706912041, Test Loss: 0.06839632987976074\n", "Epoch 5187/10000, Training Loss: 0.037446584552526474, Test Loss: 0.06840556114912033\n", "Epoch 5188/10000, Training Loss: 0.03743870183825493, Test Loss: 0.06841397285461426\n", "Epoch 5189/10000, Training Loss: 0.037430837750434875, Test Loss: 0.06842299550771713\n", "Epoch 5190/10000, Training Loss: 0.03742295503616333, Test Loss: 0.06843271851539612\n", "Epoch 5191/10000, Training Loss: 0.0374150387942791, Test Loss: 0.06844297051429749\n", "Epoch 5192/10000, Training Loss: 0.03740716353058815, Test Loss: 0.0684521496295929\n", "Epoch 5193/10000, Training Loss: 0.03739926964044571, Test Loss: 0.06846076250076294\n", "Epoch 5194/10000, Training Loss: 0.03739137202501297, Test Loss: 0.06846973299980164\n", "Epoch 5195/10000, Training Loss: 0.03738352656364441, Test Loss: 0.0684795156121254\n", "Epoch 5196/10000, Training Loss: 0.03737565502524376, Test Loss: 0.06848961859941483\n", "Epoch 5197/10000, Training Loss: 0.037367768585681915, Test Loss: 0.06849910318851471\n", "Epoch 5198/10000, Training Loss: 0.03735989332199097, Test Loss: 0.06850799918174744\n", "Epoch 5199/10000, Training Loss: 0.03735201060771942, Test Loss: 0.06851693242788315\n", "Epoch 5200/10000, Training Loss: 0.03734415024518967, Test Loss: 0.06852654367685318\n", "Epoch 5201/10000, Training Loss: 0.03733629733324051, Test Loss: 0.0685361996293068\n", "Epoch 5202/10000, Training Loss: 0.03732842579483986, Test Loss: 0.06854622811079025\n", "Epoch 5203/10000, Training Loss: 0.03732052817940712, Test Loss: 0.06855562329292297\n", "Epoch 5204/10000, Training Loss: 0.03731266409158707, Test Loss: 0.06856465339660645\n", "Epoch 5205/10000, Training Loss: 0.03730481117963791, Test Loss: 0.06857388466596603\n", "Epoch 5206/10000, Training Loss: 0.03729693591594696, Test Loss: 0.0685836523771286\n", "Epoch 5207/10000, Training Loss: 0.0372890867292881, Test Loss: 0.0685935989022255\n", "Epoch 5208/10000, Training Loss: 0.03728119656443596, Test Loss: 0.06860291212797165\n", "Epoch 5209/10000, Training Loss: 0.0372733399271965, Test Loss: 0.06861211359500885\n", "Epoch 5210/10000, Training Loss: 0.03726546838879585, Test Loss: 0.06862135231494904\n", "Epoch 5211/10000, Training Loss: 0.0372576080262661, Test Loss: 0.06863097846508026\n", "Epoch 5212/10000, Training Loss: 0.03724974766373634, Test Loss: 0.06864103674888611\n", "Epoch 5213/10000, Training Loss: 0.037241898477077484, Test Loss: 0.06865087151527405\n", "Epoch 5214/10000, Training Loss: 0.037234023213386536, Test Loss: 0.06866001337766647\n", "Epoch 5215/10000, Training Loss: 0.03722617030143738, Test Loss: 0.06866920739412308\n", "Epoch 5216/10000, Training Loss: 0.03721834346652031, Test Loss: 0.06867887824773788\n", "Epoch 5217/10000, Training Loss: 0.037210460752248764, Test Loss: 0.0686885267496109\n", "Epoch 5218/10000, Training Loss: 0.03720260411500931, 
Test Loss: 0.0686984658241272\n", "Epoch 5219/10000, Training Loss: 0.037194762378931046, Test Loss: 0.0687081590294838\n", "[... epochs 5220-6042 elided: training loss decreases steadily from about 0.0372 to 0.0312 while test loss rises from about 0.0687 to 0.0797, consistent with overfitting ...]\n", "Epoch 6043/10000, Training Loss: 0.031163962557911873, Test Loss: 0.07969697564840317\n", "Epoch 6044/10000, Training Loss: 0.031157206743955612, Test Loss: 
0.07971595972776413\n", "Epoch 6045/10000, Training Loss: 0.031150508671998978, Test Loss: 0.07973410934209824\n", "Epoch 6046/10000, Training Loss: 0.031143782660365105, Test Loss: 0.07975012809038162\n", "Epoch 6047/10000, Training Loss: 0.031137049198150635, Test Loss: 0.07976676523685455\n", "Epoch 6048/10000, Training Loss: 0.03113030083477497, Test Loss: 0.07978571206331253\n", "Epoch 6049/10000, Training Loss: 0.031123606488108635, Test Loss: 0.07980364561080933\n", "Epoch 6050/10000, Training Loss: 0.031116904690861702, Test Loss: 0.0798194408416748\n", "Epoch 6051/10000, Training Loss: 0.03111017309129238, Test Loss: 0.07983715087175369\n", "Epoch 6052/10000, Training Loss: 0.031103461980819702, Test Loss: 0.07985623925924301\n", "Epoch 6053/10000, Training Loss: 0.031096776947379112, Test Loss: 0.07987304776906967\n", "Epoch 6054/10000, Training Loss: 0.031090011820197105, Test Loss: 0.07988888770341873\n", "Epoch 6055/10000, Training Loss: 0.031083310022950172, Test Loss: 0.07990759611129761\n", "Epoch 6056/10000, Training Loss: 0.031076611950993538, Test Loss: 0.07992623746395111\n", "Epoch 6057/10000, Training Loss: 0.031069891527295113, Test Loss: 0.07994228601455688\n", "Epoch 6058/10000, Training Loss: 0.031063197180628777, Test Loss: 0.07995937019586563\n", "Epoch 6059/10000, Training Loss: 0.03105650655925274, Test Loss: 0.07997836917638779\n", "Epoch 6060/10000, Training Loss: 0.031049763783812523, Test Loss: 0.07999564707279205\n", "Epoch 6061/10000, Training Loss: 0.031043102964758873, Test Loss: 0.08001239597797394\n", "Epoch 6062/10000, Training Loss: 0.0310363732278347, Test Loss: 0.08003007620573044\n", "Epoch 6063/10000, Training Loss: 0.03102969191968441, Test Loss: 0.08004853874444962\n", "Epoch 6064/10000, Training Loss: 0.03102298080921173, Test Loss: 0.08006572723388672\n", "Epoch 6065/10000, Training Loss: 0.031016269698739052, Test Loss: 0.08008282631635666\n", "Epoch 6066/10000, Training Loss: 0.03100956603884697, Test Loss: 0.08010108768939972\n", "Epoch 6067/10000, Training Loss: 0.03100288100540638, Test Loss: 0.08011869341135025\n", "Epoch 6068/10000, Training Loss: 0.030996209010481834, Test Loss: 0.08013572543859482\n", "Epoch 6069/10000, Training Loss: 0.030989516526460648, Test Loss: 0.08015359938144684\n", "Epoch 6070/10000, Training Loss: 0.030982809141278267, Test Loss: 0.08017167448997498\n", "Epoch 6071/10000, Training Loss: 0.03097616136074066, Test Loss: 0.08018913120031357\n", "Epoch 6072/10000, Training Loss: 0.030969463288784027, Test Loss: 0.0802064836025238\n", "Epoch 6073/10000, Training Loss: 0.030962765216827393, Test Loss: 0.08022481948137283\n", "Epoch 6074/10000, Training Loss: 0.030956074595451355, Test Loss: 0.08024270087480545\n", "Epoch 6075/10000, Training Loss: 0.030949413776397705, Test Loss: 0.08025931566953659\n", "Epoch 6076/10000, Training Loss: 0.030942751094698906, Test Loss: 0.08027730882167816\n", "Epoch 6077/10000, Training Loss: 0.03093605674803257, Test Loss: 0.08029606938362122\n", "Epoch 6078/10000, Training Loss: 0.03092937171459198, Test Loss: 0.08031318336725235\n", "Epoch 6079/10000, Training Loss: 0.03092271089553833, Test Loss: 0.0803302600979805\n", "Epoch 6080/10000, Training Loss: 0.030916007235646248, Test Loss: 0.08034925162792206\n", "Epoch 6081/10000, Training Loss: 0.030909350141882896, Test Loss: 0.0803673267364502\n", "Epoch 6082/10000, Training Loss: 0.030902672559022903, Test Loss: 0.08038381487131119\n", "Epoch 6083/10000, Training Loss: 0.030896060168743134, Test Loss: 
0.08040167391300201\n", "Epoch 6084/10000, Training Loss: 0.030889390036463737, Test Loss: 0.0804206132888794\n", "Epoch 6085/10000, Training Loss: 0.0308826956897974, Test Loss: 0.08043775707483292\n", "Epoch 6086/10000, Training Loss: 0.030876051634550095, Test Loss: 0.08045496046543121\n", "Epoch 6087/10000, Training Loss: 0.03086937963962555, Test Loss: 0.08047366142272949\n", "Epoch 6088/10000, Training Loss: 0.03086269460618496, Test Loss: 0.08049191534519196\n", "Epoch 6089/10000, Training Loss: 0.03085608221590519, Test Loss: 0.0805087760090828\n", "Epoch 6090/10000, Training Loss: 0.030849376693367958, Test Loss: 0.08052677661180496\n", "Epoch 6091/10000, Training Loss: 0.030842754989862442, Test Loss: 0.0805458053946495\n", "Epoch 6092/10000, Training Loss: 0.030836060643196106, Test Loss: 0.08056344091892242\n", "Epoch 6093/10000, Training Loss: 0.030829446390271187, Test Loss: 0.08058016747236252\n", "Epoch 6094/10000, Training Loss: 0.03082278184592724, Test Loss: 0.0805986076593399\n", "Epoch 6095/10000, Training Loss: 0.03081609681248665, Test Loss: 0.08061754703521729\n", "Epoch 6096/10000, Training Loss: 0.03080945648252964, Test Loss: 0.08063483983278275\n", "Epoch 6097/10000, Training Loss: 0.03080279380083084, Test Loss: 0.0806519016623497\n", "Epoch 6098/10000, Training Loss: 0.03079618699848652, Test Loss: 0.08067109435796738\n", "Epoch 6099/10000, Training Loss: 0.030789511278271675, Test Loss: 0.08068954199552536\n", "Epoch 6100/10000, Training Loss: 0.030782848596572876, Test Loss: 0.08070596307516098\n", "Epoch 6101/10000, Training Loss: 0.030776266008615494, Test Loss: 0.08072447776794434\n", "Epoch 6102/10000, Training Loss: 0.030769599601626396, Test Loss: 0.08074404299259186\n", "Epoch 6103/10000, Training Loss: 0.030762959271669388, Test Loss: 0.08076100796461105\n", "Epoch 6104/10000, Training Loss: 0.03075633943080902, Test Loss: 0.08077780157327652\n", "Epoch 6105/10000, Training Loss: 0.030749717727303505, Test Loss: 0.08079728484153748\n", "Epoch 6106/10000, Training Loss: 0.030743073672056198, Test Loss: 0.08081606775522232\n", "Epoch 6107/10000, Training Loss: 0.03073648177087307, Test Loss: 0.08083246648311615\n", "Epoch 6108/10000, Training Loss: 0.030729811638593674, Test Loss: 0.08085078001022339\n", "Epoch 6109/10000, Training Loss: 0.030723189935088158, Test Loss: 0.08087006956338882\n", "Epoch 6110/10000, Training Loss: 0.030716586858034134, Test Loss: 0.08088775724172592\n", "Epoch 6111/10000, Training Loss: 0.030709922313690186, Test Loss: 0.0809052437543869\n", "Epoch 6112/10000, Training Loss: 0.030703283846378326, Test Loss: 0.0809241309762001\n", "Epoch 6113/10000, Training Loss: 0.0306966844946146, Test Loss: 0.0809425562620163\n", "Epoch 6114/10000, Training Loss: 0.030690046027302742, Test Loss: 0.08095971494913101\n", "Epoch 6115/10000, Training Loss: 0.030683418735861778, Test Loss: 0.08097803592681885\n", "Epoch 6116/10000, Training Loss: 0.0306768286973238, Test Loss: 0.0809972882270813\n", "Epoch 6117/10000, Training Loss: 0.030670220032334328, Test Loss: 0.08101491630077362\n", "Epoch 6118/10000, Training Loss: 0.03066357597708702, Test Loss: 0.08103227615356445\n", "Epoch 6119/10000, Training Loss: 0.03065701201558113, Test Loss: 0.0810515433549881\n", "Epoch 6120/10000, Training Loss: 0.030650364235043526, Test Loss: 0.08106998354196548\n", "Epoch 6121/10000, Training Loss: 0.030643783509731293, Test Loss: 0.08108757436275482\n", "Epoch 6122/10000, Training Loss: 0.03063715063035488, Test Loss: 0.08110601454973221\n", "Epoch 
6123/10000, Training Loss: 0.030630579218268394, Test Loss: 0.0811247006058693\n", "Epoch 6124/10000, Training Loss: 0.030623944476246834, Test Loss: 0.0811426192522049\n", "Epoch 6125/10000, Training Loss: 0.03061731904745102, Test Loss: 0.08116058260202408\n", "Epoch 6126/10000, Training Loss: 0.030610745772719383, Test Loss: 0.08117938786745071\n", "Epoch 6127/10000, Training Loss: 0.030604127794504166, Test Loss: 0.08119785785675049\n", "Epoch 6128/10000, Training Loss: 0.03059752844274044, Test Loss: 0.08121549338102341\n", "Epoch 6129/10000, Training Loss: 0.030590951442718506, Test Loss: 0.08123422414064407\n", "Epoch 6130/10000, Training Loss: 0.03058435767889023, Test Loss: 0.0812528133392334\n", "Epoch 6131/10000, Training Loss: 0.03057774156332016, Test Loss: 0.0812714546918869\n", "Epoch 6132/10000, Training Loss: 0.03057117387652397, Test Loss: 0.08128928393125534\n", "Epoch 6133/10000, Training Loss: 0.03056454099714756, Test Loss: 0.08130782842636108\n", "Epoch 6134/10000, Training Loss: 0.030557960271835327, Test Loss: 0.08132617920637131\n", "Epoch 6135/10000, Training Loss: 0.030551405623555183, Test Loss: 0.08134479075670242\n", "Epoch 6136/10000, Training Loss: 0.03054478019475937, Test Loss: 0.0813630074262619\n", "Epoch 6137/10000, Training Loss: 0.03053823858499527, Test Loss: 0.08138147741556168\n", "Epoch 6138/10000, Training Loss: 0.03053164854645729, Test Loss: 0.08140011131763458\n", "Epoch 6139/10000, Training Loss: 0.03052503988146782, Test Loss: 0.08141862601041794\n", "Epoch 6140/10000, Training Loss: 0.030518442392349243, Test Loss: 0.0814371332526207\n", "Epoch 6141/10000, Training Loss: 0.030511900782585144, Test Loss: 0.08145562559366226\n", "Epoch 6142/10000, Training Loss: 0.030505288392305374, Test Loss: 0.08147387206554413\n", "Epoch 6143/10000, Training Loss: 0.03049875609576702, Test Loss: 0.08149246126413345\n", "Epoch 6144/10000, Training Loss: 0.030492162331938744, Test Loss: 0.08151121437549591\n", "Epoch 6145/10000, Training Loss: 0.0304856039583683, Test Loss: 0.08152969181537628\n", "Epoch 6146/10000, Training Loss: 0.03047901764512062, Test Loss: 0.08154826611280441\n", "Epoch 6147/10000, Training Loss: 0.03047243505716324, Test Loss: 0.08156680315732956\n", "Epoch 6148/10000, Training Loss: 0.030465885996818542, Test Loss: 0.08158507943153381\n", "Epoch 6149/10000, Training Loss: 0.030459318310022354, Test Loss: 0.08160381019115448\n", "Epoch 6150/10000, Training Loss: 0.030452750623226166, Test Loss: 0.08162270486354828\n", "Epoch 6151/10000, Training Loss: 0.030446214601397514, Test Loss: 0.0816410481929779\n", "Epoch 6152/10000, Training Loss: 0.03043961524963379, Test Loss: 0.08165959268808365\n", "Epoch 6153/10000, Training Loss: 0.030433034524321556, Test Loss: 0.08167847990989685\n", "Epoch 6154/10000, Training Loss: 0.03042650781571865, Test Loss: 0.08169740438461304\n", "Epoch 6155/10000, Training Loss: 0.030419912189245224, Test Loss: 0.08171557635068893\n", "Epoch 6156/10000, Training Loss: 0.03041340783238411, Test Loss: 0.08173394203186035\n", "Epoch 6157/10000, Training Loss: 0.030406879261136055, Test Loss: 0.08175334334373474\n", "Epoch 6158/10000, Training Loss: 0.030400272458791733, Test Loss: 0.08177173137664795\n", "Epoch 6159/10000, Training Loss: 0.030393702909350395, Test Loss: 0.08178999274969101\n", "Epoch 6160/10000, Training Loss: 0.0303871538490057, Test Loss: 0.08180937170982361\n", "Epoch 6161/10000, Training Loss: 0.03038063459098339, Test Loss: 0.08182808756828308\n", "Epoch 6162/10000, Training Loss: 
0.030374078080058098, Test Loss: 0.08184630423784256\n", "Epoch 6163/10000, Training Loss: 0.030367545783519745, Test Loss: 0.081865593791008\n", "Epoch 6164/10000, Training Loss: 0.030360985547304153, Test Loss: 0.08188457787036896\n", "Epoch 6165/10000, Training Loss: 0.030354445800185204, Test Loss: 0.08190245926380157\n", "Epoch 6166/10000, Training Loss: 0.030347909778356552, Test Loss: 0.08192145079374313\n", "Epoch 6167/10000, Training Loss: 0.0303413737565279, Test Loss: 0.0819408968091011\n", "Epoch 6168/10000, Training Loss: 0.030334819108247757, Test Loss: 0.08195964246988297\n", "Epoch 6169/10000, Training Loss: 0.030328301712870598, Test Loss: 0.08197789639234543\n", "Epoch 6170/10000, Training Loss: 0.030321719124913216, Test Loss: 0.08199674636125565\n", "Epoch 6171/10000, Training Loss: 0.03031526878476143, Test Loss: 0.08201605081558228\n", "Epoch 6172/10000, Training Loss: 0.030308673158288002, Test Loss: 0.08203478902578354\n", "Epoch 6173/10000, Training Loss: 0.03030216135084629, Test Loss: 0.08205319195985794\n", "Epoch 6174/10000, Training Loss: 0.030295660719275475, Test Loss: 0.08207275718450546\n", "Epoch 6175/10000, Training Loss: 0.030289096757769585, Test Loss: 0.08209142833948135\n", "Epoch 6176/10000, Training Loss: 0.03028261847794056, Test Loss: 0.08210938423871994\n", "Epoch 6177/10000, Training Loss: 0.03027607686817646, Test Loss: 0.08212897181510925\n", "Epoch 6178/10000, Training Loss: 0.030269568786025047, Test Loss: 0.08214831352233887\n", "Epoch 6179/10000, Training Loss: 0.030263029038906097, Test Loss: 0.08216623216867447\n", "Epoch 6180/10000, Training Loss: 0.03025650605559349, Test Loss: 0.08218536525964737\n", "Epoch 6181/10000, Training Loss: 0.030249986797571182, Test Loss: 0.08220496773719788\n", "Epoch 6182/10000, Training Loss: 0.03024345636367798, Test Loss: 0.08222363144159317\n", "Epoch 6183/10000, Training Loss: 0.030237004160881042, Test Loss: 0.08224212378263474\n", "Epoch 6184/10000, Training Loss: 0.030230486765503883, Test Loss: 0.08226150274276733\n", "Epoch 6185/10000, Training Loss: 0.03022395446896553, Test Loss: 0.08228112012147903\n", "Epoch 6186/10000, Training Loss: 0.030217457562685013, Test Loss: 0.08229941129684448\n", "Epoch 6187/10000, Training Loss: 0.0302109457552433, Test Loss: 0.0823180228471756\n", "Epoch 6188/10000, Training Loss: 0.030204448848962784, Test Loss: 0.08233782649040222\n", "Epoch 6189/10000, Training Loss: 0.03019792214035988, Test Loss: 0.08235689997673035\n", "Epoch 6190/10000, Training Loss: 0.03019143082201481, Test Loss: 0.08237528800964355\n", "Epoch 6191/10000, Training Loss: 0.030184892937541008, Test Loss: 0.08239487558603287\n", "Epoch 6192/10000, Training Loss: 0.03017842024564743, Test Loss: 0.08241424709558487\n", "Epoch 6193/10000, Training Loss: 0.030171938240528107, Test Loss: 0.08243254572153091\n", "Epoch 6194/10000, Training Loss: 0.030165474861860275, Test Loss: 0.08245175331830978\n", "Epoch 6195/10000, Training Loss: 0.03015894629061222, Test Loss: 0.08247147500514984\n", "Epoch 6196/10000, Training Loss: 0.030152425169944763, Test Loss: 0.08249016106128693\n", "Epoch 6197/10000, Training Loss: 0.03014596365392208, Test Loss: 0.08250898867845535\n", "Epoch 6198/10000, Training Loss: 0.030139494687318802, Test Loss: 0.0825289860367775\n", "Epoch 6199/10000, Training Loss: 0.03013300523161888, Test Loss: 0.08254788815975189\n", "Epoch 6200/10000, Training Loss: 0.03012654557824135, Test Loss: 0.08256624639034271\n", "Epoch 6201/10000, Training Loss: 0.03012002259492874, Test 
Loss: 0.08258605003356934\n", "Epoch 6202/10000, Training Loss: 0.030113518238067627, Test Loss: 0.08260568231344223\n", "Epoch 6203/10000, Training Loss: 0.03010706976056099, Test Loss: 0.08262427896261215\n", "Epoch 6204/10000, Training Loss: 0.030100587755441666, Test Loss: 0.08264335244894028\n", "Epoch 6205/10000, Training Loss: 0.03009413368999958, Test Loss: 0.08266301453113556\n", "Epoch 6206/10000, Training Loss: 0.030087606981396675, Test Loss: 0.08268246054649353\n", "Epoch 6207/10000, Training Loss: 0.030081138014793396, Test Loss: 0.08270099014043808\n", "Epoch 6208/10000, Training Loss: 0.030074695125222206, Test Loss: 0.08272061496973038\n", "Epoch 6209/10000, Training Loss: 0.030068181455135345, Test Loss: 0.08274003863334656\n", "Epoch 6210/10000, Training Loss: 0.030061759054660797, Test Loss: 0.08275961875915527\n", "Epoch 6211/10000, Training Loss: 0.03005526214838028, Test Loss: 0.08277852833271027\n", "Epoch 6212/10000, Training Loss: 0.03004884347319603, Test Loss: 0.08279752731323242\n", "Epoch 6213/10000, Training Loss: 0.030042320489883423, Test Loss: 0.0828171968460083\n", "Epoch 6214/10000, Training Loss: 0.03003588132560253, Test Loss: 0.08283673971891403\n", "Epoch 6215/10000, Training Loss: 0.03002944029867649, Test Loss: 0.08285561203956604\n", "Epoch 6216/10000, Training Loss: 0.03002297133207321, Test Loss: 0.08287495374679565\n", "Epoch 6217/10000, Training Loss: 0.030016526579856873, Test Loss: 0.08289512246847153\n", "Epoch 6218/10000, Training Loss: 0.030010035261511803, Test Loss: 0.0829140692949295\n", "Epoch 6219/10000, Training Loss: 0.030003592371940613, Test Loss: 0.08293274790048599\n", "Epoch 6220/10000, Training Loss: 0.029997142031788826, Test Loss: 0.08295291662216187\n", "Epoch 6221/10000, Training Loss: 0.02999069169163704, Test Loss: 0.08297266066074371\n", "Epoch 6222/10000, Training Loss: 0.02998424507677555, Test Loss: 0.08299126476049423\n", "Epoch 6223/10000, Training Loss: 0.029977785423398018, Test Loss: 0.0830109566450119\n", "Epoch 6224/10000, Training Loss: 0.029971342533826828, Test Loss: 0.08303120732307434\n", "Epoch 6225/10000, Training Loss: 0.02996489591896534, Test Loss: 0.08304962515830994\n", "Epoch 6226/10000, Training Loss: 0.029958458617329597, Test Loss: 0.08306904137134552\n", "Epoch 6227/10000, Training Loss: 0.029951972886919975, Test Loss: 0.08308945596218109\n", "Epoch 6228/10000, Training Loss: 0.029945554211735725, Test Loss: 0.08310862630605698\n", "Epoch 6229/10000, Training Loss: 0.029939129948616028, Test Loss: 0.08312740921974182\n", "Epoch 6230/10000, Training Loss: 0.02993266098201275, Test Loss: 0.08314792066812515\n", "Epoch 6231/10000, Training Loss: 0.029926229268312454, Test Loss: 0.08316774666309357\n", "Epoch 6232/10000, Training Loss: 0.0299198217689991, Test Loss: 0.08318626880645752\n", "Epoch 6233/10000, Training Loss: 0.02991333045065403, Test Loss: 0.08320588618516922\n", "Epoch 6234/10000, Training Loss: 0.029906928539276123, Test Loss: 0.08322625607252121\n", "Epoch 6235/10000, Training Loss: 0.029900528490543365, Test Loss: 0.08324521780014038\n", "Epoch 6236/10000, Training Loss: 0.029894059523940086, Test Loss: 0.08326458185911179\n", "Epoch 6237/10000, Training Loss: 0.029887625947594643, Test Loss: 0.08328491449356079\n", "Epoch 6238/10000, Training Loss: 0.029881207272410393, Test Loss: 0.08330442756414413\n", "Epoch 6239/10000, Training Loss: 0.029874805361032486, Test Loss: 0.08332355320453644\n", "Epoch 6240/10000, Training Loss: 0.029868368059396744, Test Loss: 
0.0833437442779541\n", "Epoch 6241/10000, Training Loss: 0.02986195497214794, Test Loss: 0.08336358517408371\n", "Epoch 6242/10000, Training Loss: 0.029855502769351006, Test Loss: 0.08338283747434616\n", "Epoch 6243/10000, Training Loss: 0.02984911948442459, Test Loss: 0.08340220898389816\n", "Epoch 6244/10000, Training Loss: 0.029842713847756386, Test Loss: 0.0834224596619606\n", "Epoch 6245/10000, Training Loss: 0.029836302623152733, Test Loss: 0.08344196528196335\n", "Epoch 6246/10000, Training Loss: 0.029829908162355423, Test Loss: 0.08346154540777206\n", "Epoch 6247/10000, Training Loss: 0.029823506250977516, Test Loss: 0.08348142355680466\n", "Epoch 6248/10000, Training Loss: 0.029817063361406326, Test Loss: 0.08350127935409546\n", "Epoch 6249/10000, Training Loss: 0.029810650274157524, Test Loss: 0.08352101594209671\n", "Epoch 6250/10000, Training Loss: 0.029804226011037827, Test Loss: 0.08354111015796661\n", "Epoch 6251/10000, Training Loss: 0.029797842726111412, Test Loss: 0.083560511469841\n", "Epoch 6252/10000, Training Loss: 0.029791435226798058, Test Loss: 0.08357985317707062\n", "Epoch 6253/10000, Training Loss: 0.029785048216581345, Test Loss: 0.08360038697719574\n", "Epoch 6254/10000, Training Loss: 0.02977863885462284, Test Loss: 0.08362024277448654\n", "Epoch 6255/10000, Training Loss: 0.029772231355309486, Test Loss: 0.08363940566778183\n", "Epoch 6256/10000, Training Loss: 0.02976583130657673, Test Loss: 0.08365942537784576\n", "Epoch 6257/10000, Training Loss: 0.029759451746940613, Test Loss: 0.08367982506752014\n", "Epoch 6258/10000, Training Loss: 0.0297530684620142, Test Loss: 0.08369924873113632\n", "Epoch 6259/10000, Training Loss: 0.02974664606153965, Test Loss: 0.08371899276971817\n", "Epoch 6260/10000, Training Loss: 0.029740260913968086, Test Loss: 0.08373857289552689\n", "Epoch 6261/10000, Training Loss: 0.029733892530202866, Test Loss: 0.08375871926546097\n", "Epoch 6262/10000, Training Loss: 0.02972749061882496, Test Loss: 0.08377844840288162\n", "Epoch 6263/10000, Training Loss: 0.029721125960350037, Test Loss: 0.08379839360713959\n", "Epoch 6264/10000, Training Loss: 0.029714711010456085, Test Loss: 0.08381840586662292\n", "Epoch 6265/10000, Training Loss: 0.029708368703722954, Test Loss: 0.0838383287191391\n", "Epoch 6266/10000, Training Loss: 0.02970200590789318, Test Loss: 0.08385828882455826\n", "Epoch 6267/10000, Training Loss: 0.029695583507418633, Test Loss: 0.0838785246014595\n", "Epoch 6268/10000, Training Loss: 0.029689181596040726, Test Loss: 0.08389836549758911\n", "Epoch 6269/10000, Training Loss: 0.029682839289307594, Test Loss: 0.0839177593588829\n", "Epoch 6270/10000, Training Loss: 0.029676469042897224, Test Loss: 0.0839383602142334\n", "Epoch 6271/10000, Training Loss: 0.029670070856809616, Test Loss: 0.08395835757255554\n", "Epoch 6272/10000, Training Loss: 0.029663734138011932, Test Loss: 0.08397793769836426\n", "Epoch 6273/10000, Training Loss: 0.029657332226634026, Test Loss: 0.0839984193444252\n", "Epoch 6274/10000, Training Loss: 0.0296509750187397, Test Loss: 0.08401837199926376\n", "Epoch 6275/10000, Training Loss: 0.029644601047039032, Test Loss: 0.08403825759887695\n", "Epoch 6276/10000, Training Loss: 0.0296382624655962, Test Loss: 0.08405807614326477\n", "Epoch 6277/10000, Training Loss: 0.02963189035654068, Test Loss: 0.08407849073410034\n", "Epoch 6278/10000, Training Loss: 0.02962551824748516, Test Loss: 0.0840986892580986\n", "Epoch 6279/10000, Training Loss: 0.02961915358901024, Test Loss: 0.08411851525306702\n", "Epoch 
6280/10000, Training Loss: 0.02961280196905136, Test Loss: 0.08413850516080856\n", "Epoch 6281/10000, Training Loss: 0.029606472700834274, Test Loss: 0.08415895700454712\n", "Epoch 6282/10000, Training Loss: 0.029600095003843307, Test Loss: 0.08417925238609314\n", "Epoch 6283/10000, Training Loss: 0.029593756422400475, Test Loss: 0.08419857174158096\n", "Epoch 6284/10000, Training Loss: 0.029587380588054657, Test Loss: 0.08421950787305832\n", "Epoch 6285/10000, Training Loss: 0.02958105318248272, Test Loss: 0.08423949033021927\n", "Epoch 6286/10000, Training Loss: 0.0295746810734272, Test Loss: 0.08425921946763992\n", "Epoch 6287/10000, Training Loss: 0.029568376019597054, Test Loss: 0.08427970111370087\n", "Epoch 6288/10000, Training Loss: 0.029562022536993027, Test Loss: 0.08430004119873047\n", "Epoch 6289/10000, Training Loss: 0.029555639252066612, Test Loss: 0.08432044088840485\n", "Epoch 6290/10000, Training Loss: 0.02954929694533348, Test Loss: 0.08434056490659714\n", "Epoch 6291/10000, Training Loss: 0.029542958363890648, Test Loss: 0.08436008542776108\n", "Epoch 6292/10000, Training Loss: 0.02953663282096386, Test Loss: 0.08438093960285187\n", "Epoch 6293/10000, Training Loss: 0.02953031286597252, Test Loss: 0.08440151065587997\n", "Epoch 6294/10000, Training Loss: 0.029523935168981552, Test Loss: 0.08442104607820511\n", "Epoch 6295/10000, Training Loss: 0.02951761893928051, Test Loss: 0.08444187790155411\n", "Epoch 6296/10000, Training Loss: 0.029511312022805214, Test Loss: 0.08446218073368073\n", "Epoch 6297/10000, Training Loss: 0.029504969716072083, Test Loss: 0.0844823569059372\n", "Epoch 6298/10000, Training Loss: 0.029498649761080742, Test Loss: 0.08450241386890411\n", "Epoch 6299/10000, Training Loss: 0.029492296278476715, Test Loss: 0.08452300727367401\n", "Epoch 6300/10000, Training Loss: 0.029486022889614105, Test Loss: 0.08454351872205734\n", "Epoch 6301/10000, Training Loss: 0.029479652643203735, Test Loss: 0.08456379920244217\n", "Epoch 6302/10000, Training Loss: 0.02947334200143814, Test Loss: 0.08458418399095535\n", "Epoch 6303/10000, Training Loss: 0.02946702390909195, Test Loss: 0.08460406213998795\n", "Epoch 6304/10000, Training Loss: 0.0294607225805521, Test Loss: 0.0846250057220459\n", "Epoch 6305/10000, Training Loss: 0.029454361647367477, Test Loss: 0.08464568853378296\n", "Epoch 6306/10000, Training Loss: 0.029448051005601883, Test Loss: 0.0846654400229454\n", "Epoch 6307/10000, Training Loss: 0.029441747814416885, Test Loss: 0.08468592911958694\n", "Epoch 6308/10000, Training Loss: 0.02943546697497368, Test Loss: 0.08470743149518967\n", "Epoch 6309/10000, Training Loss: 0.029429107904434204, Test Loss: 0.08472710102796555\n", "Epoch 6310/10000, Training Loss: 0.029422855004668236, Test Loss: 0.08474688977003098\n", "Epoch 6311/10000, Training Loss: 0.029416490346193314, Test Loss: 0.08476851135492325\n", "Epoch 6312/10000, Training Loss: 0.02941022254526615, Test Loss: 0.0847889855504036\n", "Epoch 6313/10000, Training Loss: 0.029403893277049065, Test Loss: 0.08480841666460037\n", "Epoch 6314/10000, Training Loss: 0.029397577047348022, Test Loss: 0.08483002334833145\n", "Epoch 6315/10000, Training Loss: 0.02939123474061489, Test Loss: 0.08485071361064911\n", "Epoch 6316/10000, Training Loss: 0.029384955763816833, Test Loss: 0.08487004786729813\n", "Epoch 6317/10000, Training Loss: 0.029378697276115417, Test Loss: 0.08489126712083817\n", "Epoch 6318/10000, Training Loss: 0.02937239035964012, Test Loss: 0.08491270989179611\n", "Epoch 6319/10000, Training 
Loss: 0.029366053640842438, Test Loss: 0.08493184298276901\n", "Epoch 6320/10000, Training Loss: 0.02935979515314102, Test Loss: 0.08495309948921204\n", "Epoch 6321/10000, Training Loss: 0.029353471472859383, Test Loss: 0.08497466146945953\n", "Epoch 6322/10000, Training Loss: 0.029347239062190056, Test Loss: 0.08499381691217422\n", "Epoch 6323/10000, Training Loss: 0.02934093400835991, Test Loss: 0.08501444011926651\n", "Epoch 6324/10000, Training Loss: 0.029334640130400658, Test Loss: 0.08503659069538116\n", "Epoch 6325/10000, Training Loss: 0.029328349977731705, Test Loss: 0.08505602926015854\n", "Epoch 6326/10000, Training Loss: 0.029322030022740364, Test Loss: 0.08507659286260605\n", "Epoch 6327/10000, Training Loss: 0.029315747320652008, Test Loss: 0.08509836345911026\n", "Epoch 6328/10000, Training Loss: 0.029309511184692383, Test Loss: 0.08511820435523987\n", "Epoch 6329/10000, Training Loss: 0.029303178191184998, Test Loss: 0.08513859659433365\n", "Epoch 6330/10000, Training Loss: 0.02929692342877388, Test Loss: 0.08516053855419159\n", "Epoch 6331/10000, Training Loss: 0.02929067239165306, Test Loss: 0.08518050611019135\n", "Epoch 6332/10000, Training Loss: 0.02928437851369381, Test Loss: 0.08520118147134781\n", "Epoch 6333/10000, Training Loss: 0.029278097674250603, Test Loss: 0.08522260189056396\n", "Epoch 6334/10000, Training Loss: 0.02927182801067829, Test Loss: 0.08524268120527267\n", "Epoch 6335/10000, Training Loss: 0.029265575110912323, Test Loss: 0.08526331931352615\n", "Epoch 6336/10000, Training Loss: 0.029259303584694862, Test Loss: 0.08528482168912888\n", "Epoch 6337/10000, Training Loss: 0.029253019019961357, Test Loss: 0.08530531078577042\n", "Epoch 6338/10000, Training Loss: 0.029246753081679344, Test Loss: 0.08532567322254181\n", "Epoch 6339/10000, Training Loss: 0.029240550473332405, Test Loss: 0.08534763008356094\n", "Epoch 6340/10000, Training Loss: 0.029234211891889572, Test Loss: 0.08536795526742935\n", "Epoch 6341/10000, Training Loss: 0.02922799065709114, Test Loss: 0.08538831025362015\n", "Epoch 6342/10000, Training Loss: 0.029221709817647934, Test Loss: 0.08540966361761093\n", "Epoch 6343/10000, Training Loss: 0.029215462505817413, Test Loss: 0.08543066680431366\n", "Epoch 6344/10000, Training Loss: 0.0292091965675354, Test Loss: 0.08545119315385818\n", "Epoch 6345/10000, Training Loss: 0.029202938079833984, Test Loss: 0.08547205477952957\n", "Epoch 6346/10000, Training Loss: 0.02919669821858406, Test Loss: 0.08549325913190842\n", "Epoch 6347/10000, Training Loss: 0.0291904266923666, Test Loss: 0.08551405370235443\n", "Epoch 6348/10000, Training Loss: 0.029184184968471527, Test Loss: 0.0855354517698288\n", "Epoch 6349/10000, Training Loss: 0.029177948832511902, Test Loss: 0.08555590361356735\n", "Epoch 6350/10000, Training Loss: 0.029171699658036232, Test Loss: 0.08557689189910889\n", "Epoch 6351/10000, Training Loss: 0.029165461659431458, Test Loss: 0.08559808135032654\n", "Epoch 6352/10000, Training Loss: 0.0291591864079237, Test Loss: 0.08561893552541733\n", "Epoch 6353/10000, Training Loss: 0.02915296144783497, Test Loss: 0.08564038574695587\n", "Epoch 6354/10000, Training Loss: 0.029146743938326836, Test Loss: 0.08566126227378845\n", "Epoch 6355/10000, Training Loss: 0.029140472412109375, Test Loss: 0.08568159490823746\n", "Epoch 6356/10000, Training Loss: 0.029134247452020645, Test Loss: 0.08570334315299988\n", "Epoch 6357/10000, Training Loss: 0.02912801131606102, Test Loss: 0.08572455495595932\n", "Epoch 6358/10000, Training Loss: 
0.029121793806552887, Test Loss: 0.08574490994215012\n", "Epoch 6359/10000, Training Loss: 0.02911553718149662, Test Loss: 0.08576634526252747\n", "Epoch 6360/10000, Training Loss: 0.029109302908182144, Test Loss: 0.08578792214393616\n", "Epoch 6361/10000, Training Loss: 0.029103098437190056, Test Loss: 0.08580821752548218\n", "Epoch 6362/10000, Training Loss: 0.029096847400069237, Test Loss: 0.08582977950572968\n", "Epoch 6363/10000, Training Loss: 0.029090629890561104, Test Loss: 0.0858510211110115\n", "Epoch 6364/10000, Training Loss: 0.0290844663977623, Test Loss: 0.08587192744016647\n", "Epoch 6365/10000, Training Loss: 0.029078172519803047, Test Loss: 0.08589282631874084\n", "Epoch 6366/10000, Training Loss: 0.029071925207972527, Test Loss: 0.08591460436582565\n", "Epoch 6367/10000, Training Loss: 0.029065735638141632, Test Loss: 0.0859355702996254\n", "Epoch 6368/10000, Training Loss: 0.02905949577689171, Test Loss: 0.0859563872218132\n", "Epoch 6369/10000, Training Loss: 0.02905328758060932, Test Loss: 0.08597829192876816\n", "Epoch 6370/10000, Training Loss: 0.02904709428548813, Test Loss: 0.08599960803985596\n", "Epoch 6371/10000, Training Loss: 0.029040832072496414, Test Loss: 0.08601940423250198\n", "Epoch 6372/10000, Training Loss: 0.029034672304987907, Test Loss: 0.08604192733764648\n", "Epoch 6373/10000, Training Loss: 0.02902846969664097, Test Loss: 0.08606328815221786\n", "Epoch 6374/10000, Training Loss: 0.029022216796875, Test Loss: 0.0860837772488594\n", "Epoch 6375/10000, Training Loss: 0.029016023501753807, Test Loss: 0.08610542118549347\n", "Epoch 6376/10000, Training Loss: 0.02900981903076172, Test Loss: 0.08612711727619171\n", "Epoch 6377/10000, Training Loss: 0.029003623872995377, Test Loss: 0.08614758402109146\n", "Epoch 6378/10000, Training Loss: 0.028997426852583885, Test Loss: 0.08616919815540314\n", "Epoch 6379/10000, Training Loss: 0.028991203755140305, Test Loss: 0.08619111031293869\n", "Epoch 6380/10000, Training Loss: 0.028985029086470604, Test Loss: 0.086211659014225\n", "Epoch 6381/10000, Training Loss: 0.028978833928704262, Test Loss: 0.08623337745666504\n", "Epoch 6382/10000, Training Loss: 0.02897261641919613, Test Loss: 0.08625512570142746\n", "Epoch 6383/10000, Training Loss: 0.028966417536139488, Test Loss: 0.08627589792013168\n", "Epoch 6384/10000, Training Loss: 0.028960201889276505, Test Loss: 0.08629701286554337\n", "Epoch 6385/10000, Training Loss: 0.028954029083251953, Test Loss: 0.08631934225559235\n", "Epoch 6386/10000, Training Loss: 0.028947820886969566, Test Loss: 0.08634036034345627\n", "Epoch 6387/10000, Training Loss: 0.028941676020622253, Test Loss: 0.0863613560795784\n", "Epoch 6388/10000, Training Loss: 0.028935492038726807, Test Loss: 0.08638330549001694\n", "Epoch 6389/10000, Training Loss: 0.028929328545928, Test Loss: 0.08640483021736145\n", "Epoch 6390/10000, Training Loss: 0.02892313338816166, Test Loss: 0.08642613142728806\n", "Epoch 6391/10000, Training Loss: 0.028916893526911736, Test Loss: 0.08644772320985794\n", "Epoch 6392/10000, Training Loss: 0.02891070954501629, Test Loss: 0.08646919578313828\n", "Epoch 6393/10000, Training Loss: 0.028904572129249573, Test Loss: 0.08649005740880966\n", "Epoch 6394/10000, Training Loss: 0.02889840118587017, Test Loss: 0.08651211857795715\n", "Epoch 6395/10000, Training Loss: 0.02889223024249077, Test Loss: 0.08653399348258972\n", "Epoch 6396/10000, Training Loss: 0.028886042535305023, Test Loss: 0.08655460923910141\n", "Epoch 6397/10000, Training Loss: 0.028879862278699875, Test 
Loss: 0.08657675236463547\n", "Epoch 6398/10000, Training Loss: 0.02887370064854622, Test Loss: 0.08659869432449341\n", "Epoch 6399/10000, Training Loss: 0.028867486864328384, Test Loss: 0.08661963045597076\n", "Epoch 6400/10000, Training Loss: 0.02886134758591652, Test Loss: 0.08664166182279587\n", "Epoch 6401/10000, Training Loss: 0.02885519340634346, Test Loss: 0.0866633877158165\n", "Epoch 6402/10000, Training Loss: 0.02884901687502861, Test Loss: 0.0866842195391655\n", "Epoch 6403/10000, Training Loss: 0.028842821717262268, Test Loss: 0.0867062509059906\n", "Epoch 6404/10000, Training Loss: 0.028836695477366447, Test Loss: 0.08672823011875153\n", "Epoch 6405/10000, Training Loss: 0.028830524533987045, Test Loss: 0.08674930036067963\n", "Epoch 6406/10000, Training Loss: 0.02882435731589794, Test Loss: 0.08677081018686295\n", "Epoch 6407/10000, Training Loss: 0.02881823480129242, Test Loss: 0.08679386228322983\n", "Epoch 6408/10000, Training Loss: 0.02881205826997757, Test Loss: 0.08681459724903107\n", "Epoch 6409/10000, Training Loss: 0.02880590781569481, Test Loss: 0.08683545887470245\n", "Epoch 6410/10000, Training Loss: 0.02879972755908966, Test Loss: 0.08685875684022903\n", "Epoch 6411/10000, Training Loss: 0.028793614357709885, Test Loss: 0.08687993139028549\n", "Epoch 6412/10000, Training Loss: 0.02878745086491108, Test Loss: 0.08690068125724792\n", "Epoch 6413/10000, Training Loss: 0.02878132276237011, Test Loss: 0.08692403882741928\n", "Epoch 6414/10000, Training Loss: 0.028775187209248543, Test Loss: 0.08694534003734589\n", "Epoch 6415/10000, Training Loss: 0.028769006952643394, Test Loss: 0.0869661495089531\n", "Epoch 6416/10000, Training Loss: 0.028762884438037872, Test Loss: 0.08698878437280655\n", "Epoch 6417/10000, Training Loss: 0.02875673957169056, Test Loss: 0.08701049536466599\n", "Epoch 6418/10000, Training Loss: 0.02875060774385929, Test Loss: 0.08703189343214035\n", "Epoch 6419/10000, Training Loss: 0.028744449838995934, Test Loss: 0.08705407381057739\n", "Epoch 6420/10000, Training Loss: 0.028738319873809814, Test Loss: 0.08707620948553085\n", "Epoch 6421/10000, Training Loss: 0.028732195496559143, Test Loss: 0.08709724247455597\n", "Epoch 6422/10000, Training Loss: 0.028726069256663322, Test Loss: 0.08711721003055573\n", "Epoch 6423/10000, Training Loss: 0.028719928115606308, Test Loss: 0.08714160323143005\n", "Epoch 6424/10000, Training Loss: 0.028713807463645935, Test Loss: 0.08716268837451935\n", "Epoch 6425/10000, Training Loss: 0.028707707300782204, Test Loss: 0.08718332648277283\n", "Epoch 6426/10000, Training Loss: 0.0287015438079834, Test Loss: 0.0872069001197815\n", "Epoch 6427/10000, Training Loss: 0.02869545668363571, Test Loss: 0.08722807466983795\n", "Epoch 6428/10000, Training Loss: 0.0286893118172884, Test Loss: 0.08724932372570038\n", "Epoch 6429/10000, Training Loss: 0.028683163225650787, Test Loss: 0.08727220445871353\n", "Epoch 6430/10000, Training Loss: 0.02867705002427101, Test Loss: 0.08729400485754013\n", "Epoch 6431/10000, Training Loss: 0.028670962899923325, Test Loss: 0.08731497079133987\n", "Epoch 6432/10000, Training Loss: 0.028664862737059593, Test Loss: 0.08733800798654556\n", "Epoch 6433/10000, Training Loss: 0.028658732771873474, Test Loss: 0.08735963702201843\n", "Epoch 6434/10000, Training Loss: 0.02865264005959034, Test Loss: 0.08738109469413757\n", "Epoch 6435/10000, Training Loss: 0.028646526858210564, Test Loss: 0.08740372210741043\n", "Epoch 6436/10000, Training Loss: 0.028640372678637505, Test Loss: 0.08742593973875046\n", 
"Epoch 6437/10000, Training Loss: 0.02863428369164467, Test Loss: 0.08744711428880692\n", "Epoch 6438/10000, Training Loss: 0.028628159314393997, Test Loss: 0.08746951073408127\n", "Epoch 6439/10000, Training Loss: 0.028622059151530266, Test Loss: 0.0874919444322586\n", "Epoch 6440/10000, Training Loss: 0.028615960851311684, Test Loss: 0.08751285821199417\n", "Epoch 6441/10000, Training Loss: 0.028609899803996086, Test Loss: 0.08753544837236404\n", "Epoch 6442/10000, Training Loss: 0.028603769838809967, Test Loss: 0.08755805343389511\n", "Epoch 6443/10000, Training Loss: 0.028597664088010788, Test Loss: 0.08757925778627396\n", "Epoch 6444/10000, Training Loss: 0.028591591864824295, Test Loss: 0.0876014232635498\n", "Epoch 6445/10000, Training Loss: 0.02858549728989601, Test Loss: 0.08762426674365997\n", "Epoch 6446/10000, Training Loss: 0.02857939340174198, Test Loss: 0.08764573186635971\n", "Epoch 6447/10000, Training Loss: 0.028573311865329742, Test Loss: 0.08766734600067139\n", "Epoch 6448/10000, Training Loss: 0.028567206114530563, Test Loss: 0.08769094944000244\n", "Epoch 6449/10000, Training Loss: 0.028561124578118324, Test Loss: 0.08771213889122009\n", "Epoch 6450/10000, Training Loss: 0.028555063530802727, Test Loss: 0.08773382753133774\n", "Epoch 6451/10000, Training Loss: 0.02854892425239086, Test Loss: 0.08775682747364044\n", "Epoch 6452/10000, Training Loss: 0.028542859479784966, Test Loss: 0.08777876198291779\n", "Epoch 6453/10000, Training Loss: 0.028536783531308174, Test Loss: 0.08780041337013245\n", "Epoch 6454/10000, Training Loss: 0.028530696406960487, Test Loss: 0.08782350271940231\n", "Epoch 6455/10000, Training Loss: 0.028524592518806458, Test Loss: 0.08784526586532593\n", "Epoch 6456/10000, Training Loss: 0.028518501669168472, Test Loss: 0.087867371737957\n", "Epoch 6457/10000, Training Loss: 0.02851247414946556, Test Loss: 0.08789002150297165\n", "Epoch 6458/10000, Training Loss: 0.02850639633834362, Test Loss: 0.08791201561689377\n", "Epoch 6459/10000, Training Loss: 0.02850036881864071, Test Loss: 0.08793392777442932\n", "Epoch 6460/10000, Training Loss: 0.028494255617260933, Test Loss: 0.0879567563533783\n", "Epoch 6461/10000, Training Loss: 0.028488190844655037, Test Loss: 0.08797890692949295\n", "Epoch 6462/10000, Training Loss: 0.02848212793469429, Test Loss: 0.08800092339515686\n", "Epoch 6463/10000, Training Loss: 0.028476102277636528, Test Loss: 0.08802356570959091\n", "Epoch 6464/10000, Training Loss: 0.028470003977417946, Test Loss: 0.08804584294557571\n", "Epoch 6465/10000, Training Loss: 0.028463922441005707, Test Loss: 0.08806799352169037\n", "Epoch 6466/10000, Training Loss: 0.0284578837454319, Test Loss: 0.08809085935354233\n", "Epoch 6467/10000, Training Loss: 0.028451817110180855, Test Loss: 0.08811283111572266\n", "Epoch 6468/10000, Training Loss: 0.028445757925510406, Test Loss: 0.08813506364822388\n", "Epoch 6469/10000, Training Loss: 0.028439702466130257, Test Loss: 0.08815767616033554\n", "Epoch 6470/10000, Training Loss: 0.028433609753847122, Test Loss: 0.08817987889051437\n", "Epoch 6471/10000, Training Loss: 0.028427595272660255, Test Loss: 0.08820242434740067\n", "Epoch 6472/10000, Training Loss: 0.028421569615602493, Test Loss: 0.08822496980428696\n", "Epoch 6473/10000, Training Loss: 0.028415504842996597, Test Loss: 0.08824709802865982\n", "Epoch 6474/10000, Training Loss: 0.028409443795681, Test Loss: 0.08826977014541626\n", "Epoch 6475/10000, Training Loss: 0.028403373435139656, Test Loss: 0.08829287439584732\n", "Epoch 6476/10000, 
Training Loss: 0.028397371992468834, Test Loss: 0.08831453323364258\n", "Epoch 6477/10000, Training Loss: 0.028391262516379356, Test Loss: 0.08833721280097961\n", "Epoch 6478/10000, Training Loss: 0.028385266661643982, Test Loss: 0.08836005628108978\n", "Epoch 6479/10000, Training Loss: 0.02837924286723137, Test Loss: 0.08838241547346115\n", "Epoch 6480/10000, Training Loss: 0.028373142704367638, Test Loss: 0.0884043350815773\n", "Epoch 6481/10000, Training Loss: 0.028367141261696815, Test Loss: 0.0884278416633606\n", "Epoch 6482/10000, Training Loss: 0.028361108154058456, Test Loss: 0.08845005929470062\n", "Epoch 6483/10000, Training Loss: 0.028355050832033157, Test Loss: 0.08847204595804214\n", "Epoch 6484/10000, Training Loss: 0.028349071741104126, Test Loss: 0.08849599957466125\n", "Epoch 6485/10000, Training Loss: 0.028342999517917633, Test Loss: 0.0885176733136177\n", "Epoch 6486/10000, Training Loss: 0.02833699621260166, Test Loss: 0.0885402262210846\n", "Epoch 6487/10000, Training Loss: 0.028330955654382706, Test Loss: 0.08856305480003357\n", "Epoch 6488/10000, Training Loss: 0.02832494117319584, Test Loss: 0.08858562260866165\n", "Epoch 6489/10000, Training Loss: 0.02831890806555748, Test Loss: 0.08860820531845093\n", "Epoch 6490/10000, Training Loss: 0.028312871232628822, Test Loss: 0.08863126486539841\n", "Epoch 6491/10000, Training Loss: 0.02830684930086136, Test Loss: 0.08865351974964142\n", "Epoch 6492/10000, Training Loss: 0.02830084227025509, Test Loss: 0.08867635577917099\n", "Epoch 6493/10000, Training Loss: 0.028294852003455162, Test Loss: 0.08869966864585876\n", "Epoch 6494/10000, Training Loss: 0.028288813307881355, Test Loss: 0.08872184157371521\n", "Epoch 6495/10000, Training Loss: 0.028282782062888145, Test Loss: 0.08874452114105225\n", "Epoch 6496/10000, Training Loss: 0.02827676758170128, Test Loss: 0.08876758813858032\n", "Epoch 6497/10000, Training Loss: 0.028270792216062546, Test Loss: 0.08878976106643677\n", "Epoch 6498/10000, Training Loss: 0.028264757245779037, Test Loss: 0.08881288021802902\n", "Epoch 6499/10000, Training Loss: 0.028258729726076126, Test Loss: 0.08883608132600784\n", "Epoch 6500/10000, Training Loss: 0.028252769261598587, Test Loss: 0.08885814994573593\n", "Epoch 6501/10000, Training Loss: 0.02824675850570202, Test Loss: 0.08888126909732819\n", "Epoch 6502/10000, Training Loss: 0.028240716084837914, Test Loss: 0.0889049619436264\n", "Epoch 6503/10000, Training Loss: 0.028234748169779778, Test Loss: 0.08892638236284256\n", "Epoch 6504/10000, Training Loss: 0.028228791430592537, Test Loss: 0.0889502763748169\n", "Epoch 6505/10000, Training Loss: 0.028222713619470596, Test Loss: 0.08897281438112259\n", "Epoch 6506/10000, Training Loss: 0.028216727077960968, Test Loss: 0.08899514377117157\n", "Epoch 6507/10000, Training Loss: 0.02821076102554798, Test Loss: 0.08901903033256531\n", "Epoch 6508/10000, Training Loss: 0.028204742819070816, Test Loss: 0.0890413448214531\n", "Epoch 6509/10000, Training Loss: 0.02819874696433544, Test Loss: 0.08906397223472595\n", "Epoch 6510/10000, Training Loss: 0.028192758560180664, Test Loss: 0.0890875980257988\n", "Epoch 6511/10000, Training Loss: 0.028186792507767677, Test Loss: 0.08911090344190598\n", "Epoch 6512/10000, Training Loss: 0.02818077802658081, Test Loss: 0.08913268893957138\n", "Epoch 6513/10000, Training Loss: 0.02817479707300663, Test Loss: 0.08915649354457855\n", "Epoch 6514/10000, Training Loss: 0.0281688142567873, Test Loss: 0.08917941153049469\n", "Epoch 6515/10000, Training Loss: 
0.028162848204374313, Test Loss: 0.0892016738653183\n", "Epoch 6516/10000, Training Loss: 0.028156884014606476, Test Loss: 0.08922554552555084\n", "Epoch 6517/10000, Training Loss: 0.02815084718167782, Test Loss: 0.08924831449985504\n", "Epoch 6518/10000, Training Loss: 0.028144899755716324, Test Loss: 0.08927120268344879\n", "Epoch 6519/10000, Training Loss: 0.028138911351561546, Test Loss: 0.08929461240768433\n", "Epoch 6520/10000, Training Loss: 0.02813294343650341, Test Loss: 0.08931772410869598\n", "Epoch 6521/10000, Training Loss: 0.028126971796154976, Test Loss: 0.08934005349874496\n", "Epoch 6522/10000, Training Loss: 0.028120988979935646, Test Loss: 0.08936398476362228\n", "Epoch 6523/10000, Training Loss: 0.028115034103393555, Test Loss: 0.08938682824373245\n", "Epoch 6524/10000, Training Loss: 0.02810901403427124, Test Loss: 0.08940970152616501\n", "Epoch 6525/10000, Training Loss: 0.028103090822696686, Test Loss: 0.08943361788988113\n", "Epoch 6526/10000, Training Loss: 0.0280971210449934, Test Loss: 0.08945602178573608\n", "Epoch 6527/10000, Training Loss: 0.028091147541999817, Test Loss: 0.08947908133268356\n", "Epoch 6528/10000, Training Loss: 0.02808515913784504, Test Loss: 0.0895029604434967\n", "Epoch 6529/10000, Training Loss: 0.02807922288775444, Test Loss: 0.08952553570270538\n", "Epoch 6530/10000, Training Loss: 0.028073221445083618, Test Loss: 0.08954927325248718\n", "Epoch 6531/10000, Training Loss: 0.028067301958799362, Test Loss: 0.08957205712795258\n", "Epoch 6532/10000, Training Loss: 0.028061358258128166, Test Loss: 0.08959528058767319\n", "Epoch 6533/10000, Training Loss: 0.028055397793650627, Test Loss: 0.08961945027112961\n", "Epoch 6534/10000, Training Loss: 0.028049428015947342, Test Loss: 0.08964167535305023\n", "Epoch 6535/10000, Training Loss: 0.02804349921643734, Test Loss: 0.08966474235057831\n", "Epoch 6536/10000, Training Loss: 0.028037529438734055, Test Loss: 0.08968926966190338\n", "Epoch 6537/10000, Training Loss: 0.028031613677740097, Test Loss: 0.08971147984266281\n", "Epoch 6538/10000, Training Loss: 0.028025632724165916, Test Loss: 0.08973457664251328\n", "Epoch 6539/10000, Training Loss: 0.028019756078720093, Test Loss: 0.08975915610790253\n", "Epoch 6540/10000, Training Loss: 0.028013791888952255, Test Loss: 0.0897819921374321\n", "Epoch 6541/10000, Training Loss: 0.028007803484797478, Test Loss: 0.0898045003414154\n", "Epoch 6542/10000, Training Loss: 0.02800186164677143, Test Loss: 0.08982943743467331\n", "Epoch 6543/10000, Training Loss: 0.027995973825454712, Test Loss: 0.0898514911532402\n", "Epoch 6544/10000, Training Loss: 0.027989985421299934, Test Loss: 0.089874766767025\n", "Epoch 6545/10000, Training Loss: 0.027984047308564186, Test Loss: 0.08989959955215454\n", "Epoch 6546/10000, Training Loss: 0.027978133410215378, Test Loss: 0.08992169797420502\n", "Epoch 6547/10000, Training Loss: 0.02797221764922142, Test Loss: 0.08994486182928085\n", "Epoch 6548/10000, Training Loss: 0.027966255322098732, Test Loss: 0.08996995538473129\n", "Epoch 6549/10000, Training Loss: 0.027960337698459625, Test Loss: 0.08999231457710266\n", "Epoch 6550/10000, Training Loss: 0.027954429388046265, Test Loss: 0.09001535922288895\n", "Epoch 6551/10000, Training Loss: 0.027948500588536263, Test Loss: 0.09004023671150208\n", "Epoch 6552/10000, Training Loss: 0.02794252336025238, Test Loss: 0.09006215631961823\n", "Epoch 6553/10000, Training Loss: 0.027936652302742004, Test Loss: 0.09008633345365524\n", "Epoch 6554/10000, Training Loss: 0.027930736541748047, 
Test Loss: 0.09011074155569077\n", "Epoch 6555/10000, Training Loss: 0.027924789115786552, Test Loss: 0.09013265371322632\n", "...\n", "Epoch 6700/10000, Training Loss: 0.027083011344075203, Test Loss: 0.09368646144866943\n", "...\n", "Epoch 6900/10000, Training Loss: 0.025974934920668602, Test Loss: 0.09907438606023788\n", "...\n", "Epoch 7100/10000, Training Loss: 0.02492501214146614, Test Loss: 0.10505266487598419\n", "...\n", "Epoch 7300/10000, Training Loss: 0.0239303857088089, Test Loss: 0.11141455173492432\n", "...\n", "Epoch 7376/10000, Training Loss: 0.02356664091348648, Test Loss: 0.11288561671972275\n", "Epoch 7377/10000, Training Loss: 0.02356189861893654, Test Loss: 0.11285817623138428\n",
"Epoch 7378/10000, Training Loss: 0.023557187989354134, Test Loss: 0.11290069669485092\n", "Epoch 7379/10000, Training Loss: 0.023552462458610535, Test Loss: 0.11293051391839981\n", "Epoch 7380/10000, Training Loss: 0.0235477052628994, Test Loss: 0.1129181832075119\n", "Epoch 7381/10000, Training Loss: 0.023542998358607292, Test Loss: 0.11298559606075287\n", "Epoch 7382/10000, Training Loss: 0.023538265377283096, Test Loss: 0.11296188086271286\n", "Epoch 7383/10000, Training Loss: 0.02353346347808838, Test Loss: 0.11301475763320923\n", "Epoch 7384/10000, Training Loss: 0.02352876216173172, Test Loss: 0.11302408576011658\n", "Epoch 7385/10000, Training Loss: 0.023524004966020584, Test Loss: 0.11303646117448807\n", "Epoch 7386/10000, Training Loss: 0.023519368842244148, Test Loss: 0.11308315396308899\n", "Epoch 7387/10000, Training Loss: 0.02351459674537182, Test Loss: 0.11307090520858765\n", "Epoch 7388/10000, Training Loss: 0.02350987121462822, Test Loss: 0.11312423646450043\n", "Epoch 7389/10000, Training Loss: 0.023505117744207382, Test Loss: 0.11312326788902283\n", "Epoch 7390/10000, Training Loss: 0.023500414565205574, Test Loss: 0.11315269768238068\n", "Epoch 7391/10000, Training Loss: 0.023495689034461975, Test Loss: 0.11318084597587585\n", "Epoch 7392/10000, Training Loss: 0.023490972816944122, Test Loss: 0.11318407952785492\n", "Epoch 7393/10000, Training Loss: 0.02348623238503933, Test Loss: 0.11322929710149765\n", "Epoch 7394/10000, Training Loss: 0.023481514304876328, Test Loss: 0.11322764307260513\n", "Epoch 7395/10000, Training Loss: 0.023476794362068176, Test Loss: 0.11326640844345093\n", "Epoch 7396/10000, Training Loss: 0.023472066968679428, Test Loss: 0.11328020691871643\n", "Epoch 7397/10000, Training Loss: 0.023467352613806725, Test Loss: 0.11329925805330276\n", "Epoch 7398/10000, Training Loss: 0.023462653160095215, Test Loss: 0.11333218961954117\n", "Epoch 7399/10000, Training Loss: 0.023457907140254974, Test Loss: 0.11333683133125305\n", "Epoch 7400/10000, Training Loss: 0.023453185334801674, Test Loss: 0.11337679624557495\n", "Epoch 7401/10000, Training Loss: 0.023448524996638298, Test Loss: 0.11338306218385696\n", "Epoch 7402/10000, Training Loss: 0.023443764075636864, Test Loss: 0.1134144738316536\n", "Epoch 7403/10000, Training Loss: 0.0234390776604414, Test Loss: 0.11343364417552948\n", "Epoch 7404/10000, Training Loss: 0.023434337228536606, Test Loss: 0.11345075070858002\n", "Epoch 7405/10000, Training Loss: 0.023429591208696365, Test Loss: 0.113482765853405\n", "Epoch 7406/10000, Training Loss: 0.023424888029694557, Test Loss: 0.11349188536405563\n", "Epoch 7407/10000, Training Loss: 0.023420225828886032, Test Loss: 0.11352567374706268\n", "Epoch 7408/10000, Training Loss: 0.023415477946400642, Test Loss: 0.11353883147239685\n", "Epoch 7409/10000, Training Loss: 0.023410780355334282, Test Loss: 0.1135646402835846\n", "Epoch 7410/10000, Training Loss: 0.023406054824590683, Test Loss: 0.1135873794555664\n", "Epoch 7411/10000, Training Loss: 0.023401373997330666, Test Loss: 0.11360486596822739\n", "Epoch 7412/10000, Training Loss: 0.023396683856844902, Test Loss: 0.11363396793603897\n", "Epoch 7413/10000, Training Loss: 0.023391950875520706, Test Loss: 0.11364739388227463\n", "Epoch 7414/10000, Training Loss: 0.023387255147099495, Test Loss: 0.11367781460285187\n", "Epoch 7415/10000, Training Loss: 0.02338256686925888, Test Loss: 0.11369331181049347\n", "Epoch 7416/10000, Training Loss: 0.023377878591418266, Test Loss: 0.11371887475252151\n", "Epoch 7417/10000, 
Training Loss: 0.023373164236545563, Test Loss: 0.11374104768037796\n", "Epoch 7418/10000, Training Loss: 0.023368436843156815, Test Loss: 0.11375963687896729\n", "Epoch 7419/10000, Training Loss: 0.0233637522906065, Test Loss: 0.11378810554742813\n", "Epoch 7420/10000, Training Loss: 0.023359054699540138, Test Loss: 0.11380206048488617\n", "Epoch 7421/10000, Training Loss: 0.02335435152053833, Test Loss: 0.11383285373449326\n", "Epoch 7422/10000, Training Loss: 0.02334967441856861, Test Loss: 0.11384762823581696\n", "Epoch 7423/10000, Training Loss: 0.023344971239566803, Test Loss: 0.11387535184621811\n", "Epoch 7424/10000, Training Loss: 0.023340296000242233, Test Loss: 0.11389482766389847\n", "Epoch 7425/10000, Training Loss: 0.023335551843047142, Test Loss: 0.11391709744930267\n", "Epoch 7426/10000, Training Loss: 0.023330865427851677, Test Loss: 0.11394185572862625\n", "Epoch 7427/10000, Training Loss: 0.023326197639107704, Test Loss: 0.11396049708127975\n", "Epoch 7428/10000, Training Loss: 0.023321490734815598, Test Loss: 0.11398673057556152\n", "Epoch 7429/10000, Training Loss: 0.02331683598458767, Test Loss: 0.11400631070137024\n", "Epoch 7430/10000, Training Loss: 0.023312119767069817, Test Loss: 0.11402970552444458\n", "Epoch 7431/10000, Training Loss: 0.023307450115680695, Test Loss: 0.11405320465564728\n", "Epoch 7432/10000, Training Loss: 0.02330273576080799, Test Loss: 0.11407317966222763\n", "Epoch 7433/10000, Training Loss: 0.023298082873225212, Test Loss: 0.11409904807806015\n", "Epoch 7434/10000, Training Loss: 0.02329334430396557, Test Loss: 0.11411819607019424\n", "Epoch 7435/10000, Training Loss: 0.023288697004318237, Test Loss: 0.11414322257041931\n", "Epoch 7436/10000, Training Loss: 0.02328399568796158, Test Loss: 0.11416463553905487\n", "Epoch 7437/10000, Training Loss: 0.023279357701539993, Test Loss: 0.1141873151063919\n", "Epoch 7438/10000, Training Loss: 0.02327468991279602, Test Loss: 0.11421076208353043\n", "Epoch 7439/10000, Training Loss: 0.023269962519407272, Test Loss: 0.11423184722661972\n", "Epoch 7440/10000, Training Loss: 0.02326527237892151, Test Loss: 0.11425662785768509\n", "Epoch 7441/10000, Training Loss: 0.02326062135398388, Test Loss: 0.11427711695432663\n", "Epoch 7442/10000, Training Loss: 0.02325594797730446, Test Loss: 0.11430227011442184\n", "Epoch 7443/10000, Training Loss: 0.023251235485076904, Test Loss: 0.11432249844074249\n", "Epoch 7444/10000, Training Loss: 0.02324657328426838, Test Loss: 0.11434751003980637\n", "Epoch 7445/10000, Training Loss: 0.02324192225933075, Test Loss: 0.11436866968870163\n", "Epoch 7446/10000, Training Loss: 0.023237256333231926, Test Loss: 0.11439216881990433\n", "Epoch 7447/10000, Training Loss: 0.023232564330101013, Test Loss: 0.1144154742360115\n", "Epoch 7448/10000, Training Loss: 0.02322792075574398, Test Loss: 0.11443708837032318\n", "Epoch 7449/10000, Training Loss: 0.023223217576742172, Test Loss: 0.11446170508861542\n", "Epoch 7450/10000, Training Loss: 0.0232185497879982, Test Loss: 0.11448290199041367\n", "Epoch 7451/10000, Training Loss: 0.023213829845190048, Test Loss: 0.11450733244419098\n", "Epoch 7452/10000, Training Loss: 0.023209214210510254, Test Loss: 0.11452912539243698\n", "Epoch 7453/10000, Training Loss: 0.023204494267702103, Test Loss: 0.11455285549163818\n", "Epoch 7454/10000, Training Loss: 0.023199845105409622, Test Loss: 0.11457575112581253\n", "Epoch 7455/10000, Training Loss: 0.023195186629891396, Test Loss: 0.11459825932979584\n", "Epoch 7456/10000, Training Loss: 
0.023190535604953766, Test Loss: 0.11462283879518509\n", "Epoch 7457/10000, Training Loss: 0.023185867816209793, Test Loss: 0.11464349180459976\n", "Epoch 7458/10000, Training Loss: 0.02318117953836918, Test Loss: 0.11466962099075317\n", "Epoch 7459/10000, Training Loss: 0.023176534101366997, Test Loss: 0.11468984186649323\n", "Epoch 7460/10000, Training Loss: 0.02317187935113907, Test Loss: 0.1147153452038765\n", "Epoch 7461/10000, Training Loss: 0.02316717617213726, Test Loss: 0.11473709344863892\n", "Epoch 7462/10000, Training Loss: 0.023162543773651123, Test Loss: 0.1147608682513237\n", "Epoch 7463/10000, Training Loss: 0.023157861083745956, Test Loss: 0.11478490382432938\n", "Epoch 7464/10000, Training Loss: 0.023153208196163177, Test Loss: 0.11480578035116196\n", "Epoch 7465/10000, Training Loss: 0.023148585110902786, Test Loss: 0.11483300477266312\n", "Epoch 7466/10000, Training Loss: 0.023143872618675232, Test Loss: 0.11485158652067184\n", "Epoch 7467/10000, Training Loss: 0.023139242082834244, Test Loss: 0.11488011479377747\n", "Epoch 7468/10000, Training Loss: 0.023134615272283554, Test Loss: 0.11489831656217575\n", "Epoch 7469/10000, Training Loss: 0.023129958659410477, Test Loss: 0.11492664366960526\n", "Epoch 7470/10000, Training Loss: 0.02312527969479561, Test Loss: 0.11494538933038712\n", "Epoch 7471/10000, Training Loss: 0.023120606318116188, Test Loss: 0.11497309058904648\n", "Epoch 7472/10000, Training Loss: 0.023115985095500946, Test Loss: 0.11499308049678802\n", "Epoch 7473/10000, Training Loss: 0.023111337795853615, Test Loss: 0.11501961946487427\n", "Epoch 7474/10000, Training Loss: 0.023106679320335388, Test Loss: 0.11504034698009491\n", "Epoch 7475/10000, Training Loss: 0.02310200035572052, Test Loss: 0.11506659537553787\n", "Epoch 7476/10000, Training Loss: 0.023097382858395576, Test Loss: 0.1150876134634018\n", "Epoch 7477/10000, Training Loss: 0.02309277281165123, Test Loss: 0.11511385440826416\n", "Epoch 7478/10000, Training Loss: 0.023088056594133377, Test Loss: 0.11513497680425644\n", "Epoch 7479/10000, Training Loss: 0.023083427920937538, Test Loss: 0.11516057699918747\n", "Epoch 7480/10000, Training Loss: 0.02307879365980625, Test Loss: 0.11518310755491257\n", "Epoch 7481/10000, Training Loss: 0.023074189200997353, Test Loss: 0.11520711332559586\n", "Epoch 7482/10000, Training Loss: 0.023069534450769424, Test Loss: 0.11523136496543884\n", "Epoch 7483/10000, Training Loss: 0.023064831271767616, Test Loss: 0.11525393277406693\n", "Epoch 7484/10000, Training Loss: 0.023060210049152374, Test Loss: 0.11527961492538452\n", "Epoch 7485/10000, Training Loss: 0.02305556833744049, Test Loss: 0.11530108004808426\n", "Epoch 7486/10000, Training Loss: 0.023050928488373756, Test Loss: 0.11532741785049438\n", "Epoch 7487/10000, Training Loss: 0.023046299815177917, Test Loss: 0.11534879356622696\n", "Epoch 7488/10000, Training Loss: 0.023041633889079094, Test Loss: 0.11537499725818634\n", "Epoch 7489/10000, Training Loss: 0.023037035018205643, Test Loss: 0.11539651453495026\n", "Epoch 7490/10000, Training Loss: 0.02303239516913891, Test Loss: 0.1154230386018753\n", "Epoch 7491/10000, Training Loss: 0.023027760908007622, Test Loss: 0.11544419825077057\n", "Epoch 7492/10000, Training Loss: 0.023023124784231186, Test Loss: 0.11547157168388367\n", "Epoch 7493/10000, Training Loss: 0.023018471896648407, Test Loss: 0.11549147218465805\n", "Epoch 7494/10000, Training Loss: 0.02301386557519436, Test Loss: 0.11552035063505173\n", "Epoch 7495/10000, Training Loss: 
0.02300924062728882, Test Loss: 0.11553868651390076\n", "Epoch 7496/10000, Training Loss: 0.023004602640867233, Test Loss: 0.11556945741176605\n", "Epoch 7497/10000, Training Loss: 0.022999973967671394, Test Loss: 0.11558622866868973\n", "Epoch 7498/10000, Training Loss: 0.022995365783572197, Test Loss: 0.11561798304319382\n", "Epoch 7499/10000, Training Loss: 0.022990714758634567, Test Loss: 0.115634486079216\n", "Epoch 7500/10000, Training Loss: 0.022986093536019325, Test Loss: 0.11566679924726486\n", "Epoch 7501/10000, Training Loss: 0.02298142947256565, Test Loss: 0.11568205803632736\n", "Epoch 7502/10000, Training Loss: 0.022976817563176155, Test Loss: 0.11571576446294785\n", "Epoch 7503/10000, Training Loss: 0.022972216829657555, Test Loss: 0.11573037505149841\n", "Epoch 7504/10000, Training Loss: 0.022967582568526268, Test Loss: 0.11576469242572784\n", "Epoch 7505/10000, Training Loss: 0.022962965071201324, Test Loss: 0.11577846854925156\n", "Epoch 7506/10000, Training Loss: 0.022958366200327873, Test Loss: 0.115814208984375\n", "Epoch 7507/10000, Training Loss: 0.022953666746616364, Test Loss: 0.1158260852098465\n", "Epoch 7508/10000, Training Loss: 0.02294902317225933, Test Loss: 0.11586438864469528\n", "Epoch 7509/10000, Training Loss: 0.022944476455450058, Test Loss: 0.11587328463792801\n", "Epoch 7510/10000, Training Loss: 0.02293982543051243, Test Loss: 0.11591488867998123\n", "Epoch 7511/10000, Training Loss: 0.022935185581445694, Test Loss: 0.11592010408639908\n", "Epoch 7512/10000, Training Loss: 0.022930603474378586, Test Loss: 0.1159665584564209\n", "Epoch 7513/10000, Training Loss: 0.022926010191440582, Test Loss: 0.1159658282995224\n", "Epoch 7514/10000, Training Loss: 0.022921394556760788, Test Loss: 0.11601968109607697\n", "Epoch 7515/10000, Training Loss: 0.022916728630661964, Test Loss: 0.11600989103317261\n", "Epoch 7516/10000, Training Loss: 0.02291210927069187, Test Loss: 0.1160750687122345\n", "Epoch 7517/10000, Training Loss: 0.022907523438334465, Test Loss: 0.11605168879032135\n", "Epoch 7518/10000, Training Loss: 0.022902920842170715, Test Loss: 0.11613297462463379\n", "Epoch 7519/10000, Training Loss: 0.022898299619555473, Test Loss: 0.11609040945768356\n", "Epoch 7520/10000, Training Loss: 0.02289368398487568, Test Loss: 0.11619517207145691\n", "Epoch 7521/10000, Training Loss: 0.022889070212841034, Test Loss: 0.11612395942211151\n", "Epoch 7522/10000, Training Loss: 0.02288445644080639, Test Loss: 0.11626417934894562\n", "Epoch 7523/10000, Training Loss: 0.022879907861351967, Test Loss: 0.1161489337682724\n", "Epoch 7524/10000, Training Loss: 0.022875268012285233, Test Loss: 0.11634425818920135\n", "Epoch 7525/10000, Training Loss: 0.02287064865231514, Test Loss: 0.11616026610136032\n", "Epoch 7526/10000, Training Loss: 0.022866101935505867, Test Loss: 0.11644206941127777\n", "Epoch 7527/10000, Training Loss: 0.02286159247159958, Test Loss: 0.11614914238452911\n", "Epoch 7528/10000, Training Loss: 0.02285701408982277, Test Loss: 0.11656833440065384\n", "Epoch 7529/10000, Training Loss: 0.022852525115013123, Test Loss: 0.11610241234302521\n", "Epoch 7530/10000, Training Loss: 0.022848082706332207, Test Loss: 0.11673744022846222\n", "Epoch 7531/10000, Training Loss: 0.02284373715519905, Test Loss: 0.11600375175476074\n", "Epoch 7532/10000, Training Loss: 0.02283954806625843, Test Loss: 0.11696511507034302\n", "Epoch 7533/10000, Training Loss: 0.022835562005639076, Test Loss: 0.11584318429231644\n", "Epoch 7534/10000, Training Loss: 0.02283182181417942, Test 
Loss: 0.11724250763654709\n", "Epoch 7535/10000, Training Loss: 0.022828195244073868, Test Loss: 0.1156628280878067\n", "Epoch 7536/10000, Training Loss: 0.022824889048933983, Test Loss: 0.11747106909751892\n", "Epoch 7537/10000, Training Loss: 0.02282099425792694, Test Loss: 0.11563462764024734\n", "Epoch 7538/10000, Training Loss: 0.022816600278019905, Test Loss: 0.11742237210273743\n", "Epoch 7539/10000, Training Loss: 0.022810766473412514, Test Loss: 0.11599040776491165\n", "Epoch 7540/10000, Training Loss: 0.0228042583912611, Test Loss: 0.1169770210981369\n", "Epoch 7541/10000, Training Loss: 0.022797897458076477, Test Loss: 0.11664708703756332\n", "Epoch 7542/10000, Training Loss: 0.022792594507336617, Test Loss: 0.11641545593738556\n", "Epoch 7543/10000, Training Loss: 0.022788507863879204, Test Loss: 0.11719533056020737\n", "Epoch 7544/10000, Training Loss: 0.02278495952486992, Test Loss: 0.11614160984754562\n", "Epoch 7545/10000, Training Loss: 0.022781118750572205, Test Loss: 0.11732813715934753\n", "Epoch 7546/10000, Training Loss: 0.022776436060667038, Test Loss: 0.11633872985839844\n", "Epoch 7547/10000, Training Loss: 0.02277102880179882, Test Loss: 0.1170334741473198\n", "Epoch 7548/10000, Training Loss: 0.022765537723898888, Test Loss: 0.11683708429336548\n", "Epoch 7549/10000, Training Loss: 0.02276061475276947, Test Loss: 0.1166265457868576\n", "Epoch 7550/10000, Training Loss: 0.022756395861506462, Test Loss: 0.11724137514829636\n", "Epoch 7551/10000, Training Loss: 0.0227524321526289, Test Loss: 0.11647474765777588\n", "Epoch 7552/10000, Training Loss: 0.02274813875555992, Test Loss: 0.11729501187801361\n", "Epoch 7553/10000, Training Loss: 0.022743308916687965, Test Loss: 0.11669009178876877\n", "Epoch 7554/10000, Training Loss: 0.022738277912139893, Test Loss: 0.11705146729946136\n", "Epoch 7555/10000, Training Loss: 0.022733351215720177, Test Loss: 0.11707453429698944\n", "Epoch 7556/10000, Training Loss: 0.022728806361556053, Test Loss: 0.11679339408874512\n", "Epoch 7557/10000, Training Loss: 0.022724485024809837, Test Loss: 0.11732751131057739\n", "Epoch 7558/10000, Training Loss: 0.022720197215676308, Test Loss: 0.11676584929227829\n", "Epoch 7559/10000, Training Loss: 0.02271566353738308, Test Loss: 0.11731178313493729\n", "Epoch 7560/10000, Training Loss: 0.02271091379225254, Test Loss: 0.11698514223098755\n", "Epoch 7561/10000, Training Loss: 0.02270611748099327, Test Loss: 0.11712666600942612\n", "Epoch 7562/10000, Training Loss: 0.02270149253308773, Test Loss: 0.11726697534322739\n", "Epoch 7563/10000, Training Loss: 0.02269698865711689, Test Loss: 0.1169871911406517\n", "Epoch 7564/10000, Training Loss: 0.022692620754241943, Test Loss: 0.11741584539413452\n", "Epoch 7565/10000, Training Loss: 0.022688135504722595, Test Loss: 0.11702968925237656\n", "Epoch 7566/10000, Training Loss: 0.02268354408442974, Test Loss: 0.1173790991306305\n", "Epoch 7567/10000, Training Loss: 0.022678881883621216, Test Loss: 0.11722259223461151\n", "Epoch 7568/10000, Training Loss: 0.02267424389719963, Test Loss: 0.11725710332393646\n", "Epoch 7569/10000, Training Loss: 0.022669678553938866, Test Loss: 0.11742474883794785\n", "Epoch 7570/10000, Training Loss: 0.022665198892354965, Test Loss: 0.11719471961259842\n", "Epoch 7571/10000, Training Loss: 0.022660749033093452, Test Loss: 0.11751855164766312\n", "Epoch 7572/10000, Training Loss: 0.02265615202486515, Test Loss: 0.11726201325654984\n", "Epoch 7573/10000, Training Loss: 0.02265157550573349, Test Loss: 0.11749137938022614\n", 
"Epoch 7574/10000, Training Loss: 0.02264699526131153, Test Loss: 0.11741913110017776\n", "Epoch 7575/10000, Training Loss: 0.022642450407147408, Test Loss: 0.11742131412029266\n", "Epoch 7576/10000, Training Loss: 0.02263793535530567, Test Loss: 0.11756894737482071\n", "Epoch 7577/10000, Training Loss: 0.022633366286754608, Test Loss: 0.11740083992481232\n", "Epoch 7578/10000, Training Loss: 0.02262893319129944, Test Loss: 0.11764121800661087\n", "Epoch 7579/10000, Training Loss: 0.022624410688877106, Test Loss: 0.11746876686811447\n", "Epoch 7580/10000, Training Loss: 0.02261984348297119, Test Loss: 0.11763416230678558\n", "Epoch 7581/10000, Training Loss: 0.02261527068912983, Test Loss: 0.11759309470653534\n", "Epoch 7582/10000, Training Loss: 0.02261076681315899, Test Loss: 0.11760135740041733\n", "Epoch 7583/10000, Training Loss: 0.022606175392866135, Test Loss: 0.11771169304847717\n", "Epoch 7584/10000, Training Loss: 0.022601686418056488, Test Loss: 0.11760182678699493\n", "Epoch 7585/10000, Training Loss: 0.022597210481762886, Test Loss: 0.11777964234352112\n", "Epoch 7586/10000, Training Loss: 0.022592691704630852, Test Loss: 0.11765941977500916\n", "Epoch 7587/10000, Training Loss: 0.02258814498782158, Test Loss: 0.11779457330703735\n", "Epoch 7588/10000, Training Loss: 0.02258361503481865, Test Loss: 0.11775781214237213\n", "Epoch 7589/10000, Training Loss: 0.022579040378332138, Test Loss: 0.117787666618824\n", "Epoch 7590/10000, Training Loss: 0.022574532777071, Test Loss: 0.11785757541656494\n", "Epoch 7591/10000, Training Loss: 0.02257002890110016, Test Loss: 0.11779804527759552\n", "Epoch 7592/10000, Training Loss: 0.022565478459000587, Test Loss: 0.11792699247598648\n", "Epoch 7593/10000, Training Loss: 0.022560983896255493, Test Loss: 0.11784512549638748\n", "Epoch 7594/10000, Training Loss: 0.022556452080607414, Test Loss: 0.11796075105667114\n", "Epoch 7595/10000, Training Loss: 0.022551946341991425, Test Loss: 0.1179220974445343\n", "Epoch 7596/10000, Training Loss: 0.02254743129014969, Test Loss: 0.11797486990690231\n", "Epoch 7597/10000, Training Loss: 0.02254287153482437, Test Loss: 0.11800701171159744\n", "Epoch 7598/10000, Training Loss: 0.022538360208272934, Test Loss: 0.11799290776252747\n", "Epoch 7599/10000, Training Loss: 0.022533852607011795, Test Loss: 0.11807861179113388\n", "Epoch 7600/10000, Training Loss: 0.022529328241944313, Test Loss: 0.11803100258111954\n", "Epoch 7601/10000, Training Loss: 0.022524846717715263, Test Loss: 0.1181279644370079\n", "Epoch 7602/10000, Training Loss: 0.022520286962389946, Test Loss: 0.11808978021144867\n", "Epoch 7603/10000, Training Loss: 0.022515790536999702, Test Loss: 0.11816024780273438\n", "Epoch 7604/10000, Training Loss: 0.02251126430928707, Test Loss: 0.11816113442182541\n", "Epoch 7605/10000, Training Loss: 0.022506801411509514, Test Loss: 0.1181868389248848\n", "Epoch 7606/10000, Training Loss: 0.02250225841999054, Test Loss: 0.11823172122240067\n", "Epoch 7607/10000, Training Loss: 0.02249774895608425, Test Loss: 0.11822056025266647\n", "Epoch 7608/10000, Training Loss: 0.0224932711571455, Test Loss: 0.11829109489917755\n", "Epoch 7609/10000, Training Loss: 0.02248869463801384, Test Loss: 0.11826684325933456\n", "Epoch 7610/10000, Training Loss: 0.022484207525849342, Test Loss: 0.11833830177783966\n", "Epoch 7611/10000, Training Loss: 0.0224797111004591, Test Loss: 0.118324875831604\n", "Epoch 7612/10000, Training Loss: 0.022475214675068855, Test Loss: 0.11837631464004517\n", "Epoch 7613/10000, Training Loss: 
0.02247067540884018, Test Loss: 0.11838845908641815\n", "Epoch 7614/10000, Training Loss: 0.02246619015932083, Test Loss: 0.11841264367103577\n", "Epoch 7615/10000, Training Loss: 0.022461645305156708, Test Loss: 0.11845070868730545\n", "Epoch 7616/10000, Training Loss: 0.0224571842700243, Test Loss: 0.1184532418847084\n", "Epoch 7617/10000, Training Loss: 0.022452713921666145, Test Loss: 0.11850705742835999\n", "Epoch 7618/10000, Training Loss: 0.022448193281888962, Test Loss: 0.11850088834762573\n", "Epoch 7619/10000, Training Loss: 0.022443654015660286, Test Loss: 0.11855686455965042\n", "Epoch 7620/10000, Training Loss: 0.02243916131556034, Test Loss: 0.11855430901050568\n", "Epoch 7621/10000, Training Loss: 0.02243463322520256, Test Loss: 0.11860206723213196\n", "Epoch 7622/10000, Training Loss: 0.022430166602134705, Test Loss: 0.11861151456832886\n", "Epoch 7623/10000, Training Loss: 0.022425685077905655, Test Loss: 0.11864480376243591\n", "Epoch 7624/10000, Training Loss: 0.02242117002606392, Test Loss: 0.11866994202136993\n", "Epoch 7625/10000, Training Loss: 0.022416671738028526, Test Loss: 0.11868834495544434\n", "Epoch 7626/10000, Training Loss: 0.022412164136767387, Test Loss: 0.11872632056474686\n", "Epoch 7627/10000, Training Loss: 0.02240763232111931, Test Loss: 0.11873476952314377\n", "Epoch 7628/10000, Training Loss: 0.0224031712859869, Test Loss: 0.11877921968698502\n", "Epoch 7629/10000, Training Loss: 0.022398672997951508, Test Loss: 0.11878544837236404\n", "Epoch 7630/10000, Training Loss: 0.022394174709916115, Test Loss: 0.11882881075143814\n", "Epoch 7631/10000, Training Loss: 0.02238968387246132, Test Loss: 0.11883911490440369\n", "Epoch 7632/10000, Training Loss: 0.022385157644748688, Test Loss: 0.1188763901591301\n", "Epoch 7633/10000, Training Loss: 0.022380691021680832, Test Loss: 0.11889354884624481\n", "Epoch 7634/10000, Training Loss: 0.022376172244548798, Test Loss: 0.11892392486333847\n", "Epoch 7635/10000, Training Loss: 0.022371726110577583, Test Loss: 0.11894836276769638\n", "Epoch 7636/10000, Training Loss: 0.0223672017455101, Test Loss: 0.11897092312574387\n", "Epoch 7637/10000, Training Loss: 0.022362692281603813, Test Loss: 0.11900349706411362\n", "Epoch 7638/10000, Training Loss: 0.0223582461476326, Test Loss: 0.11901894956827164\n", "Epoch 7639/10000, Training Loss: 0.0223537664860487, Test Loss: 0.11905749142169952\n", "Epoch 7640/10000, Training Loss: 0.022349311038851738, Test Loss: 0.11906852573156357\n", "Epoch 7641/10000, Training Loss: 0.02234480530023575, Test Loss: 0.11911003291606903\n", "Epoch 7642/10000, Training Loss: 0.02234026975929737, Test Loss: 0.11911924928426743\n", "Epoch 7643/10000, Training Loss: 0.022335752844810486, Test Loss: 0.11916167289018631\n", "Epoch 7644/10000, Training Loss: 0.022331317886710167, Test Loss: 0.11917152255773544\n", "Epoch 7645/10000, Training Loss: 0.022326866164803505, Test Loss: 0.11921236664056778\n", "Epoch 7646/10000, Training Loss: 0.02232229709625244, Test Loss: 0.11922425776720047\n", "Epoch 7647/10000, Training Loss: 0.022317852824926376, Test Loss: 0.1192624419927597\n", "Epoch 7648/10000, Training Loss: 0.022313378751277924, Test Loss: 0.1192779690027237\n", "Epoch 7649/10000, Training Loss: 0.022308889776468277, Test Loss: 0.11931207031011581\n", "Epoch 7650/10000, Training Loss: 0.022304410114884377, Test Loss: 0.11933258175849915\n", "Epoch 7651/10000, Training Loss: 0.022299963980913162, Test Loss: 0.11936111748218536\n", "Epoch 7652/10000, Training Loss: 0.022295460104942322, Test 
Loss: 0.11938753724098206\n", "Epoch 7653/10000, Training Loss: 0.022290972992777824, Test Loss: 0.11941006779670715\n", "Epoch 7654/10000, Training Loss: 0.022286441177129745, Test Loss: 0.11944196373224258\n", "Epoch 7655/10000, Training Loss: 0.02228199876844883, Test Loss: 0.11946060508489609\n", "Epoch 7656/10000, Training Loss: 0.022277535870671272, Test Loss: 0.11949549615383148\n", "Epoch 7657/10000, Training Loss: 0.02227303571999073, Test Loss: 0.119512178003788\n", "Epoch 7658/10000, Training Loss: 0.022268572822213173, Test Loss: 0.11954785883426666\n", "Epoch 7659/10000, Training Loss: 0.022264081984758377, Test Loss: 0.11956445127725601\n", "Epoch 7660/10000, Training Loss: 0.022259632125496864, Test Loss: 0.11960025876760483\n", "Epoch 7661/10000, Training Loss: 0.02225516177713871, Test Loss: 0.11961673945188522\n", "Epoch 7662/10000, Training Loss: 0.02225065603852272, Test Loss: 0.11965294182300568\n", "Epoch 7663/10000, Training Loss: 0.022246183827519417, Test Loss: 0.11966908723115921\n", "Epoch 7664/10000, Training Loss: 0.02224171906709671, Test Loss: 0.11970596015453339\n", "Epoch 7665/10000, Training Loss: 0.022237179800868034, Test Loss: 0.11972114443778992\n", "Epoch 7666/10000, Training Loss: 0.022232750430703163, Test Loss: 0.1197589710354805\n", "Epoch 7667/10000, Training Loss: 0.022228281944990158, Test Loss: 0.11977353692054749\n", "Epoch 7668/10000, Training Loss: 0.022223826497793198, Test Loss: 0.11981245875358582\n", "Epoch 7669/10000, Training Loss: 0.02221936173737049, Test Loss: 0.11982554197311401\n", "Epoch 7670/10000, Training Loss: 0.0222148597240448, Test Loss: 0.11986663937568665\n", "Epoch 7671/10000, Training Loss: 0.022210408002138138, Test Loss: 0.11987660825252533\n", "Epoch 7672/10000, Training Loss: 0.02220592461526394, Test Loss: 0.11992182582616806\n", "Epoch 7673/10000, Training Loss: 0.02220146171748638, Test Loss: 0.11992690712213516\n", "Epoch 7674/10000, Training Loss: 0.022196998819708824, Test Loss: 0.11997833102941513\n", "Epoch 7675/10000, Training Loss: 0.02219255268573761, Test Loss: 0.11997593194246292\n", "Epoch 7676/10000, Training Loss: 0.022188065573573112, Test Loss: 0.12003645300865173\n", "Epoch 7677/10000, Training Loss: 0.022183632478117943, Test Loss: 0.12002304196357727\n", "Epoch 7678/10000, Training Loss: 0.022179145365953445, Test Loss: 0.12009735405445099\n", "Epoch 7679/10000, Training Loss: 0.022174684330821037, Test Loss: 0.1200670599937439\n", "Epoch 7680/10000, Training Loss: 0.022170208394527435, Test Loss: 0.12016197293996811\n", "Epoch 7681/10000, Training Loss: 0.022165736183524132, Test Loss: 0.12010693550109863\n", "Epoch 7682/10000, Training Loss: 0.02216130681335926, Test Loss: 0.12023185938596725\n", "Epoch 7683/10000, Training Loss: 0.022156815975904465, Test Loss: 0.12014003843069077\n", "Epoch 7684/10000, Training Loss: 0.02215236984193325, Test Loss: 0.12031058967113495\n", "Epoch 7685/10000, Training Loss: 0.02214794233441353, Test Loss: 0.1201622411608696\n", "Epoch 7686/10000, Training Loss: 0.022143522277474403, Test Loss: 0.12040312588214874\n", "Epoch 7687/10000, Training Loss: 0.022139061242341995, Test Loss: 0.12016718834638596\n", "Epoch 7688/10000, Training Loss: 0.022134605795145035, Test Loss: 0.12051770836114883\n", "Epoch 7689/10000, Training Loss: 0.022130228579044342, Test Loss: 0.12014471739530563\n", "Epoch 7690/10000, Training Loss: 0.02212589420378208, Test Loss: 0.12066616117954254\n", "Epoch 7691/10000, Training Loss: 0.022121617570519447, Test Loss: 
0.12008003145456314\n", "Epoch 7692/10000, Training Loss: 0.022117413580417633, Test Loss: 0.12086602300405502\n", "Epoch 7693/10000, Training Loss: 0.02211330085992813, Test Loss: 0.11995378881692886\n", "Epoch 7694/10000, Training Loss: 0.022109465673565865, Test Loss: 0.12113113701343536\n", "Epoch 7695/10000, Training Loss: 0.022105764597654343, Test Loss: 0.11976427584886551\n", "Epoch 7696/10000, Training Loss: 0.022102514281868935, Test Loss: 0.12143511325120926\n", "Epoch 7697/10000, Training Loss: 0.022099239751696587, Test Loss: 0.11958503723144531\n", "Epoch 7698/10000, Training Loss: 0.022096140310168266, Test Loss: 0.12163271754980087\n", "Epoch 7699/10000, Training Loss: 0.022092077881097794, Test Loss: 0.11964266747236252\n", "Epoch 7700/10000, Training Loss: 0.022087177261710167, Test Loss: 0.12146539241075516\n", "Epoch 7701/10000, Training Loss: 0.02208077348768711, Test Loss: 0.12014470249414444\n", "Epoch 7702/10000, Training Loss: 0.022074036300182343, Test Loss: 0.12090242654085159\n", "Epoch 7703/10000, Training Loss: 0.02206812985241413, Test Loss: 0.12087751924991608\n", "Epoch 7704/10000, Training Loss: 0.02206357754766941, Test Loss: 0.12032942473888397\n", "Epoch 7705/10000, Training Loss: 0.022060172632336617, Test Loss: 0.12138596922159195\n", "Epoch 7706/10000, Training Loss: 0.02205689810216427, Test Loss: 0.1201479583978653\n", "Epoch 7707/10000, Training Loss: 0.022052908316254616, Test Loss: 0.12140534818172455\n", "Epoch 7708/10000, Training Loss: 0.022047800943255424, Test Loss: 0.12047312408685684\n", "Epoch 7709/10000, Training Loss: 0.02204226888716221, Test Loss: 0.12101306766271591\n", "Epoch 7710/10000, Training Loss: 0.02203703671693802, Test Loss: 0.1210329681634903\n", "Epoch 7711/10000, Training Loss: 0.02203267812728882, Test Loss: 0.12060274928808212\n", "Epoch 7712/10000, Training Loss: 0.022028794512152672, Test Loss: 0.12139955163002014\n", "Epoch 7713/10000, Training Loss: 0.022024929523468018, Test Loss: 0.12052956968545914\n", "Epoch 7714/10000, Training Loss: 0.022020574659109116, Test Loss: 0.12136763334274292\n", "Epoch 7715/10000, Training Loss: 0.022015705704689026, Test Loss: 0.12083277106285095\n", "Epoch 7716/10000, Training Loss: 0.02201075293123722, Test Loss: 0.12106820195913315\n", "Epoch 7717/10000, Training Loss: 0.022006118670105934, Test Loss: 0.12124479562044144\n", "Epoch 7718/10000, Training Loss: 0.02200191654264927, Test Loss: 0.12082505226135254\n", "Epoch 7719/10000, Training Loss: 0.021997805684804916, Test Loss: 0.12146132439374924\n", "Epoch 7720/10000, Training Loss: 0.02199358120560646, Test Loss: 0.12085821479558945\n", "Epoch 7721/10000, Training Loss: 0.021989086642861366, Test Loss: 0.12138965725898743\n", "Epoch 7722/10000, Training Loss: 0.021984366700053215, Test Loss: 0.12113066017627716\n", "Epoch 7723/10000, Training Loss: 0.021979767829179764, Test Loss: 0.12117908149957657\n", "Epoch 7724/10000, Training Loss: 0.021975291892886162, Test Loss: 0.12142114341259003\n", "Epoch 7725/10000, Training Loss: 0.021971087902784348, Test Loss: 0.12105976045131683\n", "Epoch 7726/10000, Training Loss: 0.0219668410718441, Test Loss: 0.12154144048690796\n", "Epoch 7727/10000, Training Loss: 0.021962514147162437, Test Loss: 0.12114495784044266\n", "Epoch 7728/10000, Training Loss: 0.021958008408546448, Test Loss: 0.12147206813097\n", "Epoch 7729/10000, Training Loss: 0.021953459829092026, Test Loss: 0.12136813998222351\n", "Epoch 7730/10000, Training Loss: 0.02194896899163723, Test Loss: 0.1213410422205925\n", 
"Epoch 7731/10000, Training Loss: 0.021944595500826836, Test Loss: 0.12157139182090759\n", "Epoch 7732/10000, Training Loss: 0.021940305829048157, Test Loss: 0.12129643559455872\n", "Epoch 7733/10000, Training Loss: 0.021935999393463135, Test Loss: 0.121646948158741\n", "Epoch 7734/10000, Training Loss: 0.021931558847427368, Test Loss: 0.12139101326465607\n", "Epoch 7735/10000, Training Loss: 0.0219271257519722, Test Loss: 0.12160243839025497\n", "Epoch 7736/10000, Training Loss: 0.021922649815678596, Test Loss: 0.12156455963850021\n", "Epoch 7737/10000, Training Loss: 0.02191827818751335, Test Loss: 0.12153109163045883\n", "Epoch 7738/10000, Training Loss: 0.021913904696702957, Test Loss: 0.12171367555856705\n", "Epoch 7739/10000, Training Loss: 0.021909505128860474, Test Loss: 0.12152360379695892\n", "Epoch 7740/10000, Training Loss: 0.021905187517404556, Test Loss: 0.1217765063047409\n", "Epoch 7741/10000, Training Loss: 0.021900802850723267, Test Loss: 0.1216064989566803\n", "Epoch 7742/10000, Training Loss: 0.021896373480558395, Test Loss: 0.12176274508237839\n", "Epoch 7743/10000, Training Loss: 0.02189195714890957, Test Loss: 0.12173990905284882\n", "Epoch 7744/10000, Training Loss: 0.021887557581067085, Test Loss: 0.12173082679510117\n", "Epoch 7745/10000, Training Loss: 0.021883204579353333, Test Loss: 0.12185972929000854\n", "Epoch 7746/10000, Training Loss: 0.02187880128622055, Test Loss: 0.12173819541931152\n", "Epoch 7747/10000, Training Loss: 0.021874412894248962, Test Loss: 0.12192470580339432\n", "Epoch 7748/10000, Training Loss: 0.021870065480470657, Test Loss: 0.12180482596158981\n", "Epoch 7749/10000, Training Loss: 0.02186567708849907, Test Loss: 0.1219378262758255\n", "Epoch 7750/10000, Training Loss: 0.021861247718334198, Test Loss: 0.12190868705511093\n", "Epoch 7751/10000, Training Loss: 0.021856877952814102, Test Loss: 0.12193309515714645\n", "Epoch 7752/10000, Training Loss: 0.021852485835552216, Test Loss: 0.12201046198606491\n", "Epoch 7753/10000, Training Loss: 0.021848132833838463, Test Loss: 0.1219477429986\n", "Epoch 7754/10000, Training Loss: 0.02184373326599598, Test Loss: 0.12208139151334763\n", "Epoch 7755/10000, Training Loss: 0.0218393262475729, Test Loss: 0.12199842184782028\n", "Epoch 7756/10000, Training Loss: 0.02183496206998825, Test Loss: 0.1221175491809845\n", "Epoch 7757/10000, Training Loss: 0.02183062955737114, Test Loss: 0.12207838892936707\n", "Epoch 7758/10000, Training Loss: 0.021826211363077164, Test Loss: 0.12213433533906937\n", "Epoch 7759/10000, Training Loss: 0.021821871399879456, Test Loss: 0.12216640263795853\n", "Epoch 7760/10000, Training Loss: 0.02181745320558548, Test Loss: 0.12215450406074524\n", "Epoch 7761/10000, Training Loss: 0.021813126280903816, Test Loss: 0.12224213778972626\n", "Epoch 7762/10000, Training Loss: 0.021808728575706482, Test Loss: 0.12219369411468506\n", "Epoch 7763/10000, Training Loss: 0.021804334595799446, Test Loss: 0.12229534238576889\n", "Epoch 7764/10000, Training Loss: 0.02179999276995659, Test Loss: 0.12225522100925446\n", "Epoch 7765/10000, Training Loss: 0.021795587614178658, Test Loss: 0.12233001738786697\n", "Epoch 7766/10000, Training Loss: 0.021791215986013412, Test Loss: 0.12233050167560577\n", "Epoch 7767/10000, Training Loss: 0.021786851808428764, Test Loss: 0.12235800176858902\n", "Epoch 7768/10000, Training Loss: 0.0217825286090374, Test Loss: 0.12240541726350784\n", "Epoch 7769/10000, Training Loss: 0.02177811972796917, Test Loss: 0.12239272147417068\n", "Epoch 7770/10000, Training 
Loss: 0.02177370712161064, Test Loss: 0.12246905267238617\n", "Epoch 7771/10000, Training Loss: 0.02176937647163868, Test Loss: 0.12244098633527756\n", "Epoch 7772/10000, Training Loss: 0.021765027195215225, Test Loss: 0.122519850730896\n", "Epoch 7773/10000, Training Loss: 0.02176065556704998, Test Loss: 0.12250117212533951\n", "Epoch 7774/10000, Training Loss: 0.021756289526820183, Test Loss: 0.12256086617708206\n", "Epoch 7775/10000, Training Loss: 0.02175191603600979, Test Loss: 0.12256836891174316\n", "Epoch 7776/10000, Training Loss: 0.021747546270489693, Test Loss: 0.12259849905967712\n", "Epoch 7777/10000, Training Loss: 0.021743159741163254, Test Loss: 0.12263543158769608\n", "Epoch 7778/10000, Training Loss: 0.021738817915320396, Test Loss: 0.12263983488082886\n", "Epoch 7779/10000, Training Loss: 0.021734431385993958, Test Loss: 0.12269696593284607\n", "Epoch 7780/10000, Training Loss: 0.021730052307248116, Test Loss: 0.12268801033496857\n", "Epoch 7781/10000, Training Loss: 0.02172570303082466, Test Loss: 0.12275122106075287\n", "Epoch 7782/10000, Training Loss: 0.021721314638853073, Test Loss: 0.12274349480867386\n", "Epoch 7783/10000, Training Loss: 0.02171698771417141, Test Loss: 0.12279976159334183\n", "Epoch 7784/10000, Training Loss: 0.021712608635425568, Test Loss: 0.12280379235744476\n", "Epoch 7785/10000, Training Loss: 0.021708274260163307, Test Loss: 0.12284485995769501\n", "Epoch 7786/10000, Training Loss: 0.02170391008257866, Test Loss: 0.12286596745252609\n", "Epoch 7787/10000, Training Loss: 0.021699508652091026, Test Loss: 0.1228899285197258\n", "Epoch 7788/10000, Training Loss: 0.021695183590054512, Test Loss: 0.12292716652154922\n", "Epoch 7789/10000, Training Loss: 0.021690769121050835, Test Loss: 0.12293706089258194\n", "Epoch 7790/10000, Training Loss: 0.02168644778430462, Test Loss: 0.12298566102981567\n", "Epoch 7791/10000, Training Loss: 0.021682066842913628, Test Loss: 0.12298760563135147\n", "Epoch 7792/10000, Training Loss: 0.02167770080268383, Test Loss: 0.12304075807332993\n", "Epoch 7793/10000, Training Loss: 0.021673360839486122, Test Loss: 0.12304246425628662\n", "Epoch 7794/10000, Training Loss: 0.021668963134288788, Test Loss: 0.12309151887893677\n", "Epoch 7795/10000, Training Loss: 0.021664638072252274, Test Loss: 0.1231008917093277\n", "Epoch 7796/10000, Training Loss: 0.021660251542925835, Test Loss: 0.12314047664403915\n", "Epoch 7797/10000, Training Loss: 0.021655894815921783, Test Loss: 0.12316016107797623\n", "Epoch 7798/10000, Training Loss: 0.021651534363627434, Test Loss: 0.12318964302539825\n", "Epoch 7799/10000, Training Loss: 0.021647177636623383, Test Loss: 0.12321878969669342\n", "Epoch 7800/10000, Training Loss: 0.021642832085490227, Test Loss: 0.12323959171772003\n", "Epoch 7801/10000, Training Loss: 0.02163855731487274, Test Loss: 0.12327699363231659\n", "Epoch 7802/10000, Training Loss: 0.021634168922901154, Test Loss: 0.12329044193029404\n", "Epoch 7803/10000, Training Loss: 0.021629827097058296, Test Loss: 0.12333400547504425\n", "Epoch 7804/10000, Training Loss: 0.02162541262805462, Test Loss: 0.12334321439266205\n", "Epoch 7805/10000, Training Loss: 0.021621111780405045, Test Loss: 0.12338954955339432\n", "Epoch 7806/10000, Training Loss: 0.021616723388433456, Test Loss: 0.12339749932289124\n", "Epoch 7807/10000, Training Loss: 0.021612374112010002, Test Loss: 0.12344356626272202\n", "Epoch 7808/10000, Training Loss: 0.021607985720038414, Test Loss: 0.12345333397388458\n", "Epoch 7809/10000, Training Loss: 
0.021603653207421303, Test Loss: 0.1234966292977333\n", "Epoch 7810/10000, Training Loss: 0.02159932069480419, Test Loss: 0.12350999563932419\n", "Epoch 7811/10000, Training Loss: 0.02159496210515499, Test Loss: 0.12354952841997147\n", "Epoch 7812/10000, Training Loss: 0.02159062586724758, Test Loss: 0.12356669455766678\n", "Epoch 7813/10000, Training Loss: 0.021586239337921143, Test Loss: 0.12360268831253052\n", "Epoch 7814/10000, Training Loss: 0.02158190682530403, Test Loss: 0.12362369894981384\n", "Epoch 7815/10000, Training Loss: 0.021577563136816025, Test Loss: 0.12365542352199554\n", "Epoch 7816/10000, Training Loss: 0.02157321386039257, Test Loss: 0.12368108332157135\n", "Epoch 7817/10000, Training Loss: 0.02156885527074337, Test Loss: 0.12370841950178146\n", "Epoch 7818/10000, Training Loss: 0.021564483642578125, Test Loss: 0.12373846769332886\n", "Epoch 7819/10000, Training Loss: 0.021560179069638252, Test Loss: 0.12376184016466141\n", "Epoch 7820/10000, Training Loss: 0.02155580185353756, Test Loss: 0.12379538267850876\n", "Epoch 7821/10000, Training Loss: 0.021551450714468956, Test Loss: 0.12381566315889359\n", "Epoch 7822/10000, Training Loss: 0.021547114476561546, Test Loss: 0.12385194003582001\n", "Epoch 7823/10000, Training Loss: 0.021542739123106003, Test Loss: 0.12387030571699142\n", "Epoch 7824/10000, Training Loss: 0.02153843268752098, Test Loss: 0.12390831112861633\n", "Epoch 7825/10000, Training Loss: 0.021534090861678123, Test Loss: 0.12392476201057434\n", "Epoch 7826/10000, Training Loss: 0.021529708057641983, Test Loss: 0.12396495789289474\n", "Epoch 7827/10000, Training Loss: 0.02152535878121853, Test Loss: 0.12397956848144531\n", "Epoch 7828/10000, Training Loss: 0.021521037444472313, Test Loss: 0.12402153015136719\n", "Epoch 7829/10000, Training Loss: 0.021516678854823112, Test Loss: 0.12403429299592972\n", "Epoch 7830/10000, Training Loss: 0.02151235193014145, Test Loss: 0.1240788921713829\n", "Epoch 7831/10000, Training Loss: 0.021508028730750084, Test Loss: 0.12408857047557831\n", "Epoch 7832/10000, Training Loss: 0.02150365337729454, Test Loss: 0.12413688749074936\n", "Epoch 7833/10000, Training Loss: 0.021499324589967728, Test Loss: 0.12414197623729706\n", "Epoch 7834/10000, Training Loss: 0.021495014429092407, Test Loss: 0.1241958886384964\n", "Epoch 7835/10000, Training Loss: 0.02149062417447567, Test Loss: 0.12419441342353821\n", "Epoch 7836/10000, Training Loss: 0.021486330777406693, Test Loss: 0.12425661832094193\n", "Epoch 7837/10000, Training Loss: 0.021481961011886597, Test Loss: 0.12424541264772415\n", "Epoch 7838/10000, Training Loss: 0.021477626636624336, Test Loss: 0.12431922554969788\n", "Epoch 7839/10000, Training Loss: 0.02147328294813633, Test Loss: 0.12429392337799072\n", "Epoch 7840/10000, Training Loss: 0.02146892435848713, Test Loss: 0.1243852749466896\n", "Epoch 7841/10000, Training Loss: 0.021464621648192406, Test Loss: 0.1243380606174469\n", "Epoch 7842/10000, Training Loss: 0.0214602779597044, Test Loss: 0.12445703893899918\n", "Epoch 7843/10000, Training Loss: 0.02145594358444214, Test Loss: 0.12437596917152405\n", "Epoch 7844/10000, Training Loss: 0.021451599895954132, Test Loss: 0.12453696876764297\n", "Epoch 7845/10000, Training Loss: 0.021447259932756424, Test Loss: 0.12440332770347595\n", "Epoch 7846/10000, Training Loss: 0.02144293673336506, Test Loss: 0.12463007867336273\n", "Epoch 7847/10000, Training Loss: 0.021438661962747574, Test Loss: 0.1244143396615982\n", "Epoch 7848/10000, Training Loss: 0.021434349939227104, Test 
Loss: 0.12474437803030014\n", "Epoch 7849/10000, Training Loss: 0.021430039778351784, Test Loss: 0.12439834326505661\n", "Epoch 7850/10000, Training Loss: 0.02142578363418579, Test Loss: 0.12489313632249832\n", "Epoch 7851/10000, Training Loss: 0.02142164669930935, Test Loss: 0.12433825433254242\n", "Epoch 7852/10000, Training Loss: 0.021417515352368355, Test Loss: 0.1250961720943451\n", "Epoch 7853/10000, Training Loss: 0.021413523703813553, Test Loss: 0.12421096861362457\n", "Epoch 7854/10000, Training Loss: 0.02140977792441845, Test Loss: 0.1253766566514969\n", "Epoch 7855/10000, Training Loss: 0.02140624448657036, Test Loss: 0.12399867922067642\n", "Epoch 7856/10000, Training Loss: 0.021403169259428978, Test Loss: 0.1257294863462448\n", "Epoch 7857/10000, Training Loss: 0.02140040323138237, Test Loss: 0.12367092072963715\n", "Epoch 7858/10000, Training Loss: 0.021399395540356636, Test Loss: 0.12629668414592743\n", "Epoch 7859/10000, Training Loss: 0.021397795528173447, Test Loss: 0.12347174435853958\n", "Epoch 7860/10000, Training Loss: 0.021396392956376076, Test Loss: 0.12650591135025024\n", "Epoch 7861/10000, Training Loss: 0.021389706060290337, Test Loss: 0.12416272610425949\n", "Epoch 7862/10000, Training Loss: 0.021379785612225533, Test Loss: 0.12561851739883423\n", "Epoch 7863/10000, Training Loss: 0.021370304748415947, Test Loss: 0.12561459839344025\n", "Epoch 7864/10000, Training Loss: 0.021365681663155556, Test Loss: 0.12454269081354141\n", "Epoch 7865/10000, Training Loss: 0.02136499620974064, Test Loss: 0.1264779418706894\n", "Epoch 7866/10000, Training Loss: 0.0213631521910429, Test Loss: 0.12453895807266235\n", "Epoch 7867/10000, Training Loss: 0.021357672289013863, Test Loss: 0.12605413794517517\n", "Epoch 7868/10000, Training Loss: 0.021349754184484482, Test Loss: 0.12561970949172974\n", "Epoch 7869/10000, Training Loss: 0.021343914791941643, Test Loss: 0.12510496377944946\n", "Epoch 7870/10000, Training Loss: 0.02134125493466854, Test Loss: 0.12644720077514648\n", "Epoch 7871/10000, Training Loss: 0.021338945254683495, Test Loss: 0.12497851252555847\n", "Epoch 7872/10000, Training Loss: 0.0213343296200037, Test Loss: 0.12617431581020355\n", "Epoch 7873/10000, Training Loss: 0.02132793888449669, Test Loss: 0.12581320106983185\n", "Epoch 7874/10000, Training Loss: 0.02132265269756317, Test Loss: 0.12542353570461273\n", "Epoch 7875/10000, Training Loss: 0.021319398656487465, Test Loss: 0.12646761536598206\n", "Epoch 7876/10000, Training Loss: 0.021316219121217728, Test Loss: 0.1253475546836853\n", "Epoch 7877/10000, Training Loss: 0.021311672404408455, Test Loss: 0.1262294501066208\n", "Epoch 7878/10000, Training Loss: 0.021306149661540985, Test Loss: 0.12602315843105316\n", "Epoch 7879/10000, Training Loss: 0.02130143716931343, Test Loss: 0.12565617263317108\n", "Epoch 7880/10000, Training Loss: 0.021297860890626907, Test Loss: 0.12650538980960846\n", "Epoch 7881/10000, Training Loss: 0.021294189617037773, Test Loss: 0.1256554275751114\n", "Epoch 7882/10000, Training Loss: 0.02128964476287365, Test Loss: 0.12628333270549774\n", "Epoch 7883/10000, Training Loss: 0.021284623071551323, Test Loss: 0.12620680034160614\n", "Epoch 7884/10000, Training Loss: 0.021280229091644287, Test Loss: 0.1258591264486313\n", "Epoch 7885/10000, Training Loss: 0.021276438608765602, Test Loss: 0.12655220925807953\n", "Epoch 7886/10000, Training Loss: 0.021272452548146248, Test Loss: 0.12591375410556793\n", "Epoch 7887/10000, Training Loss: 0.0212679672986269, Test Loss: 0.1263541579246521\n", 
"Epoch 7888/10000, Training Loss: 0.021263282746076584, Test Loss: 0.126360222697258\n", "Epoch 7889/10000, Training Loss: 0.021259041503071785, Test Loss: 0.1260494440793991\n", "Epoch 7890/10000, Training Loss: 0.021255074068903923, Test Loss: 0.12660808861255646\n", "Epoch 7891/10000, Training Loss: 0.021251006051898003, Test Loss: 0.1261330246925354\n", "Epoch 7892/10000, Training Loss: 0.02124655619263649, Test Loss: 0.1264447718858719\n", "Epoch 7893/10000, Training Loss: 0.021242059767246246, Test Loss: 0.12649182975292206\n", "Epoch 7894/10000, Training Loss: 0.021237829700112343, Test Loss: 0.1262301206588745\n", "Epoch 7895/10000, Training Loss: 0.02123376354575157, Test Loss: 0.12667663395404816\n", "Epoch 7896/10000, Training Loss: 0.021229594945907593, Test Loss: 0.12632183730602264\n", "Epoch 7897/10000, Training Loss: 0.02122521586716175, Test Loss: 0.1265515685081482\n", "Epoch 7898/10000, Training Loss: 0.021220887079834938, Test Loss: 0.12661053240299225\n", "Epoch 7899/10000, Training Loss: 0.021216638386249542, Test Loss: 0.1264023631811142\n", "Epoch 7900/10000, Training Loss: 0.021212492138147354, Test Loss: 0.12675723433494568\n", "Epoch 7901/10000, Training Loss: 0.021208329126238823, Test Loss: 0.12648995220661163\n", "Epoch 7902/10000, Training Loss: 0.02120399661362171, Test Loss: 0.1266695111989975\n", "Epoch 7903/10000, Training Loss: 0.02119968831539154, Test Loss: 0.12672363221645355\n", "Epoch 7904/10000, Training Loss: 0.021195439621806145, Test Loss: 0.12656712532043457\n", "Epoch 7905/10000, Training Loss: 0.021191300824284554, Test Loss: 0.12684834003448486\n", "Epoch 7906/10000, Training Loss: 0.021187111735343933, Test Loss: 0.12664371728897095\n", "Epoch 7907/10000, Training Loss: 0.02118280529975891, Test Loss: 0.12679532170295715\n", "Epoch 7908/10000, Training Loss: 0.021178532391786575, Test Loss: 0.12683473527431488\n", "Epoch 7909/10000, Training Loss: 0.021174270659685135, Test Loss: 0.1267251968383789\n", "Epoch 7910/10000, Training Loss: 0.02117006853222847, Test Loss: 0.12694843113422394\n", "Epoch 7911/10000, Training Loss: 0.021165871992707253, Test Loss: 0.12672105431556702\n", "Epoch 7912/10000, Training Loss: 0.021161776036024094, Test Loss: 0.12713725864887238\n", "Epoch 7913/10000, Training Loss: 0.02115754224359989, Test Loss: 0.1268785297870636\n", "Epoch 7914/10000, Training Loss: 0.02115325629711151, Test Loss: 0.12708237767219543\n", "Epoch 7915/10000, Training Loss: 0.02114894986152649, Test Loss: 0.127164825797081\n", "Epoch 7916/10000, Training Loss: 0.021144744008779526, Test Loss: 0.12701191008090973\n", "Epoch 7917/10000, Training Loss: 0.021140577271580696, Test Loss: 0.12732891738414764\n", "Epoch 7918/10000, Training Loss: 0.02113635092973709, Test Loss: 0.12711645662784576\n", "Epoch 7919/10000, Training Loss: 0.02113211713731289, Test Loss: 0.12730559706687927\n", "Epoch 7920/10000, Training Loss: 0.021127823740243912, Test Loss: 0.12733612954616547\n", "Epoch 7921/10000, Training Loss: 0.02112356573343277, Test Loss: 0.1272491067647934\n", "Epoch 7922/10000, Training Loss: 0.021119417622685432, Test Loss: 0.12748579680919647\n", "Epoch 7923/10000, Training Loss: 0.02111523225903511, Test Loss: 0.12731540203094482\n", "Epoch 7924/10000, Training Loss: 0.021110955625772476, Test Loss: 0.1274915188550949\n", "Epoch 7925/10000, Training Loss: 0.021106671541929245, Test Loss: 0.1274840235710144\n", "Epoch 7926/10000, Training Loss: 0.021102435886859894, Test Loss: 0.12745265662670135\n", "Epoch 7927/10000, Training Loss: 
0.02109820768237114, Test Loss: 0.12762169539928436\n", "Epoch 7928/10000, Training Loss: 0.02109406888484955, Test Loss: 0.12749138474464417\n", "Epoch 7929/10000, Training Loss: 0.021089818328619003, Test Loss: 0.12765491008758545\n", "Epoch 7930/10000, Training Loss: 0.021085571497678757, Test Loss: 0.12755103409290314\n", "Epoch 7931/10000, Training Loss: 0.021081378683447838, Test Loss: 0.12784656882286072\n", "Epoch 7932/10000, Training Loss: 0.021077213808894157, Test Loss: 0.127666637301445\n", "Epoch 7933/10000, Training Loss: 0.02107297256588936, Test Loss: 0.12787604331970215\n", "Epoch 7934/10000, Training Loss: 0.02106870524585247, Test Loss: 0.12788137793540955\n", "Epoch 7935/10000, Training Loss: 0.021064521744847298, Test Loss: 0.1278538703918457\n", "Epoch 7936/10000, Training Loss: 0.02106025628745556, Test Loss: 0.12805509567260742\n", "Epoch 7937/10000, Training Loss: 0.02105608955025673, Test Loss: 0.12791140377521515\n", "Epoch 7938/10000, Training Loss: 0.02105187252163887, Test Loss: 0.12810999155044556\n", "Epoch 7939/10000, Training Loss: 0.02104760892689228, Test Loss: 0.12806297838687897\n", "Epoch 7940/10000, Training Loss: 0.021043360233306885, Test Loss: 0.12809798121452332\n", "Epoch 7941/10000, Training Loss: 0.021039191633462906, Test Loss: 0.1282176673412323\n", "Epoch 7942/10000, Training Loss: 0.021034931764006615, Test Loss: 0.1281217336654663\n", "Epoch 7943/10000, Training Loss: 0.021030746400356293, Test Loss: 0.12829545140266418\n", "Epoch 7944/10000, Training Loss: 0.02102649211883545, Test Loss: 0.12822280824184418\n", "Epoch 7945/10000, Training Loss: 0.021022317931056023, Test Loss: 0.12830409407615662\n", "Epoch 7946/10000, Training Loss: 0.021018056198954582, Test Loss: 0.12835460901260376\n", "Epoch 7947/10000, Training Loss: 0.021013833582401276, Test Loss: 0.1283133327960968\n", "Epoch 7948/10000, Training Loss: 0.021009640768170357, Test Loss: 0.12844833731651306\n", "Epoch 7949/10000, Training Loss: 0.021005403250455856, Test Loss: 0.12837563455104828\n", "Epoch 7950/10000, Training Loss: 0.021001188084483147, Test Loss: 0.12848281860351562\n", "Epoch 7951/10000, Training Loss: 0.020996998995542526, Test Loss: 0.1284153312444687\n", "Epoch 7952/10000, Training Loss: 0.020992780104279518, Test Loss: 0.12870587408542633\n", "Epoch 7953/10000, Training Loss: 0.02098863571882248, Test Loss: 0.12848806381225586\n", "Epoch 7954/10000, Training Loss: 0.02098444662988186, Test Loss: 0.12877479195594788\n", "Epoch 7955/10000, Training Loss: 0.02098015509545803, Test Loss: 0.12867861986160278\n", "Epoch 7956/10000, Training Loss: 0.02097591571509838, Test Loss: 0.12875090539455414\n", "Epoch 7957/10000, Training Loss: 0.020971670746803284, Test Loss: 0.12888261675834656\n", "Epoch 7958/10000, Training Loss: 0.020967496559023857, Test Loss: 0.12875674664974213\n", "Epoch 7959/10000, Training Loss: 0.020963305607438087, Test Loss: 0.128997802734375\n", "Epoch 7960/10000, Training Loss: 0.020959123969078064, Test Loss: 0.12885752320289612\n", "Epoch 7961/10000, Training Loss: 0.02095489576458931, Test Loss: 0.12901458144187927\n", "Epoch 7962/10000, Training Loss: 0.02095061168074608, Test Loss: 0.12901639938354492\n", "Epoch 7963/10000, Training Loss: 0.020946376025676727, Test Loss: 0.12900400161743164\n", "Epoch 7964/10000, Training Loss: 0.020942235365509987, Test Loss: 0.12915286421775818\n", "Epoch 7965/10000, Training Loss: 0.020938029512763023, Test Loss: 0.12904050946235657\n", "Epoch 7966/10000, Training Loss: 0.020933810621500015, Test 
Loss: 0.129217728972435\n", "Epoch 7967/10000, Training Loss: 0.020929571241140366, Test Loss: 0.12914246320724487\n", "Epoch 7968/10000, Training Loss: 0.02092534862458706, Test Loss: 0.12922805547714233\n", "Epoch 7969/10000, Training Loss: 0.0209211278706789, Test Loss: 0.12920384109020233\n", "Epoch 7970/10000, Training Loss: 0.020916923880577087, Test Loss: 0.129449263215065\n", "Epoch 7971/10000, Training Loss: 0.02091277204453945, Test Loss: 0.12926335632801056\n", "Epoch 7972/10000, Training Loss: 0.020908573642373085, Test Loss: 0.12955068051815033\n", "Epoch 7973/10000, Training Loss: 0.020904336124658585, Test Loss: 0.12941884994506836\n", "Epoch 7974/10000, Training Loss: 0.020900076255202293, Test Loss: 0.12955942749977112\n", "Epoch 7975/10000, Training Loss: 0.02089584432542324, Test Loss: 0.12960940599441528\n", "Epoch 7976/10000, Training Loss: 0.020891649648547173, Test Loss: 0.12956015765666962\n", "Epoch 7977/10000, Training Loss: 0.020887499675154686, Test Loss: 0.12975426018238068\n", "Epoch 7978/10000, Training Loss: 0.02088327519595623, Test Loss: 0.12961886823177338\n", "Epoch 7979/10000, Training Loss: 0.020879089832305908, Test Loss: 0.1298190802335739\n", "Epoch 7980/10000, Training Loss: 0.020874880254268646, Test Loss: 0.12974044680595398\n", "Epoch 7981/10000, Training Loss: 0.020870592445135117, Test Loss: 0.12983064353466034\n", "Epoch 7982/10000, Training Loss: 0.020866407081484795, Test Loss: 0.12988030910491943\n", "Epoch 7983/10000, Training Loss: 0.020862162113189697, Test Loss: 0.12984339892864227\n", "Epoch 7984/10000, Training Loss: 0.020858019590377808, Test Loss: 0.12998858094215393\n", "Epoch 7985/10000, Training Loss: 0.020853793248534203, Test Loss: 0.12989649176597595\n", "Epoch 7986/10000, Training Loss: 0.02084958925843239, Test Loss: 0.13004562258720398\n", "Epoch 7987/10000, Training Loss: 0.020845413208007812, Test Loss: 0.12992902100086212\n", "Epoch 7988/10000, Training Loss: 0.020841289311647415, Test Loss: 0.13027745485305786\n", "Epoch 7989/10000, Training Loss: 0.02083718776702881, Test Loss: 0.12998928129673004\n", "Epoch 7990/10000, Training Loss: 0.02083294466137886, Test Loss: 0.13036927580833435\n", "Epoch 7991/10000, Training Loss: 0.020828725770115852, Test Loss: 0.13016052544116974\n", "Epoch 7992/10000, Training Loss: 0.02082442119717598, Test Loss: 0.13035798072814941\n", "Epoch 7993/10000, Training Loss: 0.02082013711333275, Test Loss: 0.13037601113319397\n", "Epoch 7994/10000, Training Loss: 0.020815959200263023, Test Loss: 0.13033103942871094\n", "Epoch 7995/10000, Training Loss: 0.020811740309000015, Test Loss: 0.13055063784122467\n", "Epoch 7996/10000, Training Loss: 0.020807573571801186, Test Loss: 0.13036198914051056\n", "Epoch 7997/10000, Training Loss: 0.02080339379608631, Test Loss: 0.13063935935497284\n", "Epoch 7998/10000, Training Loss: 0.0207991786301136, Test Loss: 0.1304694563150406\n", "Epoch 7999/10000, Training Loss: 0.020794952288269997, Test Loss: 0.13065436482429504\n", "Epoch 8000/10000, Training Loss: 0.020790724083781242, Test Loss: 0.13061963021755219\n", "Epoch 8001/10000, Training Loss: 0.02078651450574398, Test Loss: 0.13064590096473694\n", "Epoch 8002/10000, Training Loss: 0.02078227885067463, Test Loss: 0.13076013326644897\n", "Epoch 8003/10000, Training Loss: 0.020778141915798187, Test Loss: 0.1306626945734024\n", "Epoch 8004/10000, Training Loss: 0.020773956552147865, Test Loss: 0.1308545619249344\n", "Epoch 8005/10000, Training Loss: 0.020769767463207245, Test Loss: 0.13066551089286804\n", 
"Epoch 8006/10000, Training Loss: 0.020765604451298714, Test Loss: 0.13110598921775818\n", "Epoch 8007/10000, Training Loss: 0.02076157182455063, Test Loss: 0.1307085007429123\n", "Epoch 8008/10000, Training Loss: 0.020757408812642097, Test Loss: 0.13120928406715393\n", "Epoch 8009/10000, Training Loss: 0.02075313962996006, Test Loss: 0.13087289035320282\n", "Epoch 8010/10000, Training Loss: 0.02074890211224556, Test Loss: 0.13119575381278992\n", "Epoch 8011/10000, Training Loss: 0.020744573324918747, Test Loss: 0.1311013549566269\n", "Epoch 8012/10000, Training Loss: 0.020740291103720665, Test Loss: 0.13114304840564728\n", "Epoch 8013/10000, Training Loss: 0.02073611505329609, Test Loss: 0.1313135176897049\n", "Epoch 8014/10000, Training Loss: 0.020731955766677856, Test Loss: 0.13112713396549225\n", "Epoch 8015/10000, Training Loss: 0.020727811381220818, Test Loss: 0.13145315647125244\n", "Epoch 8016/10000, Training Loss: 0.020723648369312286, Test Loss: 0.13118627667427063\n", "Epoch 8017/10000, Training Loss: 0.020719444379210472, Test Loss: 0.13150854408740997\n", "Epoch 8018/10000, Training Loss: 0.020715204998850822, Test Loss: 0.1313117891550064\n", "Epoch 8019/10000, Training Loss: 0.020710976794362068, Test Loss: 0.13150663673877716\n", "Epoch 8020/10000, Training Loss: 0.02070671319961548, Test Loss: 0.13146725296974182\n", "Epoch 8021/10000, Training Loss: 0.02070254273712635, Test Loss: 0.13149096071720123\n", "Epoch 8022/10000, Training Loss: 0.020698292180895805, Test Loss: 0.13161000609397888\n", "Epoch 8023/10000, Training Loss: 0.02069413848221302, Test Loss: 0.13143950700759888\n", "Epoch 8024/10000, Training Loss: 0.020690100267529488, Test Loss: 0.13191844522953033\n", "Epoch 8025/10000, Training Loss: 0.020686130970716476, Test Loss: 0.13136045634746552\n", "Epoch 8026/10000, Training Loss: 0.020682310685515404, Test Loss: 0.13228486478328705\n", "Epoch 8027/10000, Training Loss: 0.020678553730249405, Test Loss: 0.13140463829040527\n", "Epoch 8028/10000, Training Loss: 0.020674481987953186, Test Loss: 0.1324060708284378\n", "Epoch 8029/10000, Training Loss: 0.020670104771852493, Test Loss: 0.13165396451950073\n", "Epoch 8030/10000, Training Loss: 0.020665492862462997, Test Loss: 0.13232336938381195\n", "Epoch 8031/10000, Training Loss: 0.020660927519202232, Test Loss: 0.13202528655529022\n", "Epoch 8032/10000, Training Loss: 0.020656423643231392, Test Loss: 0.13216009736061096\n", "Epoch 8033/10000, Training Loss: 0.020652256906032562, Test Loss: 0.13238348066806793\n", "Epoch 8034/10000, Training Loss: 0.020648149773478508, Test Loss: 0.1320481300354004\n", "Epoch 8035/10000, Training Loss: 0.020644117146730423, Test Loss: 0.132624551653862\n", "Epoch 8036/10000, Training Loss: 0.02064010687172413, Test Loss: 0.13206422328948975\n", "Epoch 8037/10000, Training Loss: 0.020636020228266716, Test Loss: 0.13271300494670868\n", "Epoch 8038/10000, Training Loss: 0.020631704479455948, Test Loss: 0.13221417367458344\n", "Epoch 8039/10000, Training Loss: 0.020627353340387344, Test Loss: 0.13267619907855988\n", "Epoch 8040/10000, Training Loss: 0.020623069256544113, Test Loss: 0.1324469894170761\n", "Epoch 8041/10000, Training Loss: 0.020618736743927002, Test Loss: 0.13258589804172516\n", "Epoch 8042/10000, Training Loss: 0.020614512264728546, Test Loss: 0.1326831877231598\n", "Epoch 8043/10000, Training Loss: 0.02061033807694912, Test Loss: 0.1325192004442215\n", "Epoch 8044/10000, Training Loss: 0.02060622349381447, Test Loss: 0.13286054134368896\n", "Epoch 8045/10000, 
Training Loss: 0.020602071657776833, Test Loss: 0.13252370059490204\n", "Epoch 8046/10000, Training Loss: 0.02059798873960972, Test Loss: 0.13295193016529083\n", "Epoch 8047/10000, Training Loss: 0.02059382200241089, Test Loss: 0.13260871171951294\n", "Epoch 8048/10000, Training Loss: 0.02058953233063221, Test Loss: 0.13296741247177124\n", "Epoch 8049/10000, Training Loss: 0.020585281774401665, Test Loss: 0.13275058567523956\n", "Epoch 8050/10000, Training Loss: 0.020581044256687164, Test Loss: 0.1329413503408432\n", "Epoch 8051/10000, Training Loss: 0.020576827228069305, Test Loss: 0.13291072845458984\n", "Epoch 8052/10000, Training Loss: 0.02057262696325779, Test Loss: 0.1329127699136734\n", "Epoch 8053/10000, Training Loss: 0.02056846395134926, Test Loss: 0.13305352628231049\n", "Epoch 8054/10000, Training Loss: 0.020564302802085876, Test Loss: 0.1329118013381958\n", "Epoch 8055/10000, Training Loss: 0.020560171455144882, Test Loss: 0.13315868377685547\n", "Epoch 8056/10000, Training Loss: 0.020555956289172173, Test Loss: 0.13295024633407593\n", "Epoch 8057/10000, Training Loss: 0.02055174484848976, Test Loss: 0.13322284817695618\n", "Epoch 8058/10000, Training Loss: 0.020547635853290558, Test Loss: 0.1329692155122757\n", "Epoch 8059/10000, Training Loss: 0.02054348587989807, Test Loss: 0.13345053791999817\n", "Epoch 8060/10000, Training Loss: 0.020539497956633568, Test Loss: 0.13293667137622833\n", "Epoch 8061/10000, Training Loss: 0.020535623654723167, Test Loss: 0.13377895951271057\n", "Epoch 8062/10000, Training Loss: 0.0205317884683609, Test Loss: 0.13296161592006683\n", "Epoch 8063/10000, Training Loss: 0.020527774468064308, Test Loss: 0.13394324481487274\n", "Epoch 8064/10000, Training Loss: 0.020523598417639732, Test Loss: 0.1331205815076828\n", "Epoch 8065/10000, Training Loss: 0.020519234240055084, Test Loss: 0.13395516574382782\n", "Epoch 8066/10000, Training Loss: 0.020514793694019318, Test Loss: 0.13338689506053925\n", "Epoch 8067/10000, Training Loss: 0.020510291680693626, Test Loss: 0.13386748731136322\n", "Epoch 8068/10000, Training Loss: 0.020505866035819054, Test Loss: 0.1336953192949295\n", "Epoch 8069/10000, Training Loss: 0.020501574501395226, Test Loss: 0.13375736773014069\n", "Epoch 8070/10000, Training Loss: 0.020497426390647888, Test Loss: 0.13397283852100372\n", "Epoch 8071/10000, Training Loss: 0.02049330249428749, Test Loss: 0.13369013369083405\n", "Epoch 8072/10000, Training Loss: 0.02048918791115284, Test Loss: 0.13417337834835052\n", "Epoch 8073/10000, Training Loss: 0.02048519253730774, Test Loss: 0.13370013236999512\n", "Epoch 8074/10000, Training Loss: 0.020481057465076447, Test Loss: 0.1342817097902298\n", "Epoch 8075/10000, Training Loss: 0.020476900041103363, Test Loss: 0.13379058241844177\n", "Epoch 8076/10000, Training Loss: 0.020472656935453415, Test Loss: 0.13430890440940857\n", "Epoch 8077/10000, Training Loss: 0.020468447357416153, Test Loss: 0.13394223153591156\n", "Epoch 8078/10000, Training Loss: 0.02046416699886322, Test Loss: 0.13428397476673126\n", "Epoch 8079/10000, Training Loss: 0.020459895953536034, Test Loss: 0.13412143290042877\n", "Epoch 8080/10000, Training Loss: 0.020455636084079742, Test Loss: 0.13424363732337952\n", "Epoch 8081/10000, Training Loss: 0.020451495423913002, Test Loss: 0.13429510593414307\n", "Epoch 8082/10000, Training Loss: 0.020447291433811188, Test Loss: 0.13421748578548431\n", "Epoch 8083/10000, Training Loss: 0.020443087443709373, Test Loss: 0.13443927466869354\n", "Epoch 8084/10000, Training Loss: 
0.020438946783542633, Test Loss: 0.13422353565692902\n", "Epoch 8085/10000, Training Loss: 0.020434850826859474, Test Loss: 0.13453400135040283\n", "Epoch 8086/10000, Training Loss: 0.02043071761727333, Test Loss: 0.1342659443616867\n", "Epoch 8087/10000, Training Loss: 0.02042650431394577, Test Loss: 0.13453729450702667\n", "Epoch 8088/10000, Training Loss: 0.020422426983714104, Test Loss: 0.13423478603363037\n", "Epoch 8089/10000, Training Loss: 0.020418362691998482, Test Loss: 0.13465175032615662\n", "Epoch 8090/10000, Training Loss: 0.020414331927895546, Test Loss: 0.13412737846374512\n", "Epoch 8091/10000, Training Loss: 0.02041056752204895, Test Loss: 0.13483993709087372\n", "Epoch 8092/10000, Training Loss: 0.02040676213800907, Test Loss: 0.13404610753059387\n", "Epoch 8093/10000, Training Loss: 0.02040291577577591, Test Loss: 0.13492178916931152\n", "Epoch 8094/10000, Training Loss: 0.0203988179564476, Test Loss: 0.134064719080925\n", "Epoch 8095/10000, Training Loss: 0.020394548773765564, Test Loss: 0.13489331305027008\n", "Epoch 8096/10000, Training Loss: 0.020390158519148827, Test Loss: 0.1341795027256012\n", "Epoch 8097/10000, Training Loss: 0.020385656505823135, Test Loss: 0.13477323949337006\n", "Epoch 8098/10000, Training Loss: 0.020381087437272072, Test Loss: 0.13435818254947662\n", "Epoch 8099/10000, Training Loss: 0.020376674830913544, Test Loss: 0.13460849225521088\n", "Epoch 8100/10000, Training Loss: 0.020372292026877403, Test Loss: 0.13454574346542358\n", "Epoch 8101/10000, Training Loss: 0.020368080586194992, Test Loss: 0.13445410132408142\n", "Epoch 8102/10000, Training Loss: 0.020363982766866684, Test Loss: 0.1346951127052307\n", "Epoch 8103/10000, Training Loss: 0.020359979942440987, Test Loss: 0.13434873521327972\n", "Epoch 8104/10000, Training Loss: 0.020355911925435066, Test Loss: 0.1347808539867401\n", "Epoch 8105/10000, Training Loss: 0.020351843908429146, Test Loss: 0.13430801033973694\n", "Epoch 8106/10000, Training Loss: 0.020347805693745613, Test Loss: 0.1347985863685608\n", "Epoch 8107/10000, Training Loss: 0.020343581214547157, Test Loss: 0.13432934880256653\n", "Epoch 8108/10000, Training Loss: 0.020339375361800194, Test Loss: 0.13475856184959412\n", "Epoch 8109/10000, Training Loss: 0.02033509500324726, Test Loss: 0.13439686596393585\n", "Epoch 8110/10000, Training Loss: 0.020330827683210373, Test Loss: 0.1346818059682846\n", "Epoch 8111/10000, Training Loss: 0.020326586440205574, Test Loss: 0.13448618352413177\n", "Epoch 8112/10000, Training Loss: 0.020322341471910477, Test Loss: 0.13459384441375732\n", "Epoch 8113/10000, Training Loss: 0.02031818777322769, Test Loss: 0.1345738023519516\n", "Epoch 8114/10000, Training Loss: 0.02031398005783558, Test Loss: 0.1345157027244568\n", "Epoch 8115/10000, Training Loss: 0.02030986361205578, Test Loss: 0.1346425563097\n", "Epoch 8116/10000, Training Loss: 0.020305704325437546, Test Loss: 0.13440954685211182\n", "Epoch 8117/10000, Training Loss: 0.020301735028624535, Test Loss: 0.13481317460536957\n", "Epoch 8118/10000, Training Loss: 0.020297879353165627, Test Loss: 0.13425227999687195\n", "Epoch 8119/10000, Training Loss: 0.020294129848480225, Test Loss: 0.1350404918193817\n", "Epoch 8120/10000, Training Loss: 0.020290544256567955, Test Loss: 0.13413558900356293\n", "Epoch 8121/10000, Training Loss: 0.020286882296204567, Test Loss: 0.13514770567417145\n", "Epoch 8122/10000, Training Loss: 0.020282885059714317, Test Loss: 0.1341383159160614\n", "Epoch 8123/10000, Training Loss: 0.020278679206967354, Test Loss: 
0.13511940836906433\n", "Epoch 8124/10000, Training Loss: 0.02027420699596405, Test Loss: 0.13426823914051056\n", "Epoch 8125/10000, Training Loss: 0.02026960998773575, Test Loss: 0.13496874272823334\n", "Epoch 8126/10000, Training Loss: 0.020264869555830956, Test Loss: 0.13448788225650787\n", "Epoch 8127/10000, Training Loss: 0.020260334014892578, Test Loss: 0.13475659489631653\n", "Epoch 8128/10000, Training Loss: 0.0202559232711792, Test Loss: 0.13472211360931396\n", "Epoch 8129/10000, Training Loss: 0.020251726731657982, Test Loss: 0.1345614343881607\n", "Epoch 8130/10000, Training Loss: 0.020247722044587135, Test Loss: 0.13490115106105804\n", "Epoch 8131/10000, Training Loss: 0.020243791863322258, Test Loss: 0.1344383955001831\n", "Epoch 8132/10000, Training Loss: 0.020239844918251038, Test Loss: 0.13499006628990173\n", "Epoch 8133/10000, Training Loss: 0.02023581974208355, Test Loss: 0.13440851867198944\n", "Epoch 8134/10000, Training Loss: 0.02023172378540039, Test Loss: 0.1349828541278839\n", "Epoch 8135/10000, Training Loss: 0.020227506756782532, Test Loss: 0.1344660371541977\n", "Epoch 8136/10000, Training Loss: 0.020223252475261688, Test Loss: 0.13489945232868195\n", "Epoch 8137/10000, Training Loss: 0.02021888457238674, Test Loss: 0.1345796138048172\n", "Epoch 8138/10000, Training Loss: 0.020214542746543884, Test Loss: 0.1347787231206894\n", "Epoch 8139/10000, Training Loss: 0.02021031267940998, Test Loss: 0.13470670580863953\n", "Epoch 8140/10000, Training Loss: 0.020206153392791748, Test Loss: 0.13466249406337738\n", "Epoch 8141/10000, Training Loss: 0.02020205557346344, Test Loss: 0.13481080532073975\n", "Epoch 8142/10000, Training Loss: 0.02019795961678028, Test Loss: 0.13453160226345062\n", "Epoch 8143/10000, Training Loss: 0.020193997770547867, Test Loss: 0.1349921077489853\n", "Epoch 8144/10000, Training Loss: 0.020190123468637466, Test Loss: 0.1343788206577301\n", "Epoch 8145/10000, Training Loss: 0.020186539739370346, Test Loss: 0.13519561290740967\n", "Epoch 8146/10000, Training Loss: 0.020182844251394272, Test Loss: 0.13425272703170776\n", "Epoch 8147/10000, Training Loss: 0.02017940580844879, Test Loss: 0.13536442816257477\n", "Epoch 8148/10000, Training Loss: 0.02017560973763466, Test Loss: 0.13425682485103607\n", "Epoch 8149/10000, Training Loss: 0.020171374082565308, Test Loss: 0.13532502949237823\n", "Epoch 8150/10000, Training Loss: 0.020166749134659767, Test Loss: 0.13445156812667847\n", "Epoch 8151/10000, Training Loss: 0.020161842927336693, Test Loss: 0.1351115107536316\n", "Epoch 8152/10000, Training Loss: 0.02015705034136772, Test Loss: 0.13475923240184784\n", "Epoch 8153/10000, Training Loss: 0.020152507349848747, Test Loss: 0.13483674824237823\n", "Epoch 8154/10000, Training Loss: 0.02014830894768238, Test Loss: 0.13505008816719055\n", "Epoch 8155/10000, Training Loss: 0.020144397392868996, Test Loss: 0.13462571799755096\n", "Epoch 8156/10000, Training Loss: 0.02014056034386158, Test Loss: 0.13522358238697052\n", "Epoch 8157/10000, Training Loss: 0.02013678476214409, Test Loss: 0.134552463889122\n", "Epoch 8158/10000, Training Loss: 0.02013273723423481, Test Loss: 0.13524164259433746\n", "Epoch 8159/10000, Training Loss: 0.020128531381487846, Test Loss: 0.13462601602077484\n", "Epoch 8160/10000, Training Loss: 0.02012414112687111, Test Loss: 0.13512763381004333\n", "Epoch 8161/10000, Training Loss: 0.020119722932577133, Test Loss: 0.1347963660955429\n", "Epoch 8162/10000, Training Loss: 0.020115390419960022, Test Loss: 0.13495232164859772\n", "Epoch 
8163/10000, Training Loss: 0.020111167803406715, Test Loss: 0.1349814236164093\n", "Epoch 8164/10000, Training Loss: 0.020107051357626915, Test Loss: 0.134797140955925\n", "Epoch 8165/10000, Training Loss: 0.020103037357330322, Test Loss: 0.13511011004447937\n", "Epoch 8166/10000, Training Loss: 0.020099064335227013, Test Loss: 0.1347174346446991\n", "Epoch 8167/10000, Training Loss: 0.020095007494091988, Test Loss: 0.13514959812164307\n", "Epoch 8168/10000, Training Loss: 0.020090891048312187, Test Loss: 0.13472647964954376\n", "Epoch 8169/10000, Training Loss: 0.02008674666285515, Test Loss: 0.135105699300766\n", "Epoch 8170/10000, Training Loss: 0.020082490518689156, Test Loss: 0.134803906083107\n", "Epoch 8171/10000, Training Loss: 0.020078273490071297, Test Loss: 0.13501101732254028\n", "Epoch 8172/10000, Training Loss: 0.02007407508790493, Test Loss: 0.1349097341299057\n", "Epoch 8173/10000, Training Loss: 0.020069874823093414, Test Loss: 0.13490772247314453\n", "Epoch 8174/10000, Training Loss: 0.02006573975086212, Test Loss: 0.1350032240152359\n", "Epoch 8175/10000, Training Loss: 0.020061736926436424, Test Loss: 0.13483163714408875\n", "Epoch 8176/10000, Training Loss: 0.020057598128914833, Test Loss: 0.13505733013153076\n", "Epoch 8177/10000, Training Loss: 0.020053541287779808, Test Loss: 0.1347995549440384\n", "Epoch 8178/10000, Training Loss: 0.02004942297935486, Test Loss: 0.1350659728050232\n", "Epoch 8179/10000, Training Loss: 0.020045308396220207, Test Loss: 0.13476580381393433\n", "Epoch 8180/10000, Training Loss: 0.020041294395923615, Test Loss: 0.1351473331451416\n", "Epoch 8181/10000, Training Loss: 0.020037276670336723, Test Loss: 0.13470067083835602\n", "Epoch 8182/10000, Training Loss: 0.0200333409011364, Test Loss: 0.13527075946331024\n", "Epoch 8183/10000, Training Loss: 0.020029377192258835, Test Loss: 0.1346316635608673\n", "Epoch 8184/10000, Training Loss: 0.020025579258799553, Test Loss: 0.1354018598794937\n", "Epoch 8185/10000, Training Loss: 0.020021630451083183, Test Loss: 0.1346333622932434\n", "Epoch 8186/10000, Training Loss: 0.020017510280013084, Test Loss: 0.13539890944957733\n", "Epoch 8187/10000, Training Loss: 0.020013226196169853, Test Loss: 0.13474765419960022\n", "Epoch 8188/10000, Training Loss: 0.020008830353617668, Test Loss: 0.1352900266647339\n", "Epoch 8189/10000, Training Loss: 0.020004374906420708, Test Loss: 0.13493119180202484\n", "Epoch 8190/10000, Training Loss: 0.020000027492642403, Test Loss: 0.1351325660943985\n", "Epoch 8191/10000, Training Loss: 0.019995782524347305, Test Loss: 0.13512159883975983\n", "Epoch 8192/10000, Training Loss: 0.019991682842373848, Test Loss: 0.1349886804819107\n", "Epoch 8193/10000, Training Loss: 0.019987620413303375, Test Loss: 0.13526616990566254\n", "Epoch 8194/10000, Training Loss: 0.019983641803264618, Test Loss: 0.13490070402622223\n", "Epoch 8195/10000, Training Loss: 0.019979646429419518, Test Loss: 0.13533839583396912\n", "Epoch 8196/10000, Training Loss: 0.019975600764155388, Test Loss: 0.13488376140594482\n", "Epoch 8197/10000, Training Loss: 0.01997152529656887, Test Loss: 0.13533759117126465\n", "Epoch 8198/10000, Training Loss: 0.019967319443821907, Test Loss: 0.13492876291275024\n", "Epoch 8199/10000, Training Loss: 0.019963163882493973, Test Loss: 0.13528162240982056\n", "Epoch 8200/10000, Training Loss: 0.01995895244181156, Test Loss: 0.13501311838626862\n", "Epoch 8201/10000, Training Loss: 0.019954698160290718, Test Loss: 0.13519668579101562\n", "Epoch 8202/10000, Training Loss: 
0.01995052956044674, Test Loss: 0.13510924577713013\n", "Epoch 8203/10000, Training Loss: 0.019946392625570297, Test Loss: 0.13511131703853607\n", "Epoch 8204/10000, Training Loss: 0.019942300394177437, Test Loss: 0.13519187271595\n", "Epoch 8205/10000, Training Loss: 0.019938159734010696, Test Loss: 0.13504526019096375\n", "Epoch 8206/10000, Training Loss: 0.019934095442295074, Test Loss: 0.13524742424488068\n", "Epoch 8207/10000, Training Loss: 0.019929969683289528, Test Loss: 0.13500843942165375\n", "Epoch 8208/10000, Training Loss: 0.019925953820347786, Test Loss: 0.13527117669582367\n", "Epoch 8209/10000, Training Loss: 0.0199219211935997, Test Loss: 0.1350012868642807\n", "Epoch 8210/10000, Training Loss: 0.019917700439691544, Test Loss: 0.13526669144630432\n", "Epoch 8211/10000, Training Loss: 0.019913623109459877, Test Loss: 0.13497604429721832\n", "Epoch 8212/10000, Training Loss: 0.01990959607064724, Test Loss: 0.13534781336784363\n", "Epoch 8213/10000, Training Loss: 0.019905583932995796, Test Loss: 0.13490213453769684\n", "Epoch 8214/10000, Training Loss: 0.019901681691408157, Test Loss: 0.1354895979166031\n", "Epoch 8215/10000, Training Loss: 0.019897814840078354, Test Loss: 0.13480181992053986\n", "Epoch 8216/10000, Training Loss: 0.01989411748945713, Test Loss: 0.13566111028194427\n", "Epoch 8217/10000, Training Loss: 0.01989036425948143, Test Loss: 0.13474629819393158\n", "Epoch 8218/10000, Training Loss: 0.01988646574318409, Test Loss: 0.13572099804878235\n", "Epoch 8219/10000, Training Loss: 0.019882410764694214, Test Loss: 0.13479451835155487\n", "Epoch 8220/10000, Training Loss: 0.019878119230270386, Test Loss: 0.1356675773859024\n", "Epoch 8221/10000, Training Loss: 0.019873647019267082, Test Loss: 0.13493825495243073\n", "Epoch 8222/10000, Training Loss: 0.01986914500594139, Test Loss: 0.13552600145339966\n", "Epoch 8223/10000, Training Loss: 0.019864657893776894, Test Loss: 0.1351378858089447\n", "Epoch 8224/10000, Training Loss: 0.01986023411154747, Test Loss: 0.13534849882125854\n", "Epoch 8225/10000, Training Loss: 0.01985601894557476, Test Loss: 0.13533611595630646\n", "Epoch 8226/10000, Training Loss: 0.019851941615343094, Test Loss: 0.13519126176834106\n", "Epoch 8227/10000, Training Loss: 0.01984790340065956, Test Loss: 0.13548751175403595\n", "Epoch 8228/10000, Training Loss: 0.019843915477395058, Test Loss: 0.13508887588977814\n", "Epoch 8229/10000, Training Loss: 0.019839994609355927, Test Loss: 0.13557051122188568\n", "Epoch 8230/10000, Training Loss: 0.019836002960801125, Test Loss: 0.1350545734167099\n", "Epoch 8231/10000, Training Loss: 0.019831905141472816, Test Loss: 0.13558262586593628\n", "Epoch 8232/10000, Training Loss: 0.019827814772725105, Test Loss: 0.1350841224193573\n", "Epoch 8233/10000, Training Loss: 0.01982365921139717, Test Loss: 0.13553589582443237\n", "Epoch 8234/10000, Training Loss: 0.01981937326490879, Test Loss: 0.13516004383563995\n", "Epoch 8235/10000, Training Loss: 0.019815200939774513, Test Loss: 0.135453000664711\n", "Epoch 8236/10000, Training Loss: 0.019810983911156654, Test Loss: 0.13525593280792236\n", "Epoch 8237/10000, Training Loss: 0.019806867465376854, Test Loss: 0.13536114990711212\n", "Epoch 8238/10000, Training Loss: 0.01980271190404892, Test Loss: 0.1353478729724884\n", "Epoch 8239/10000, Training Loss: 0.019798563793301582, Test Loss: 0.13528041541576385\n", "Epoch 8240/10000, Training Loss: 0.019794458523392677, Test Loss: 0.13542018830776215\n", "Epoch 8241/10000, Training Loss: 0.01979043148458004, Test Loss: 
0.13518361747264862\n", "Epoch 8242/10000, Training Loss: 0.019786469638347626, Test Loss: 0.1355658918619156\n", "Epoch 8243/10000, Training Loss: 0.01978256367146969, Test Loss: 0.1350529044866562\n", "Epoch 8244/10000, Training Loss: 0.01977878250181675, Test Loss: 0.13575130701065063\n", "Epoch 8245/10000, Training Loss: 0.01977512799203396, Test Loss: 0.13491807878017426\n", "Epoch 8246/10000, Training Loss: 0.019771594554185867, Test Loss: 0.1359396129846573\n", "Epoch 8247/10000, Training Loss: 0.019767945632338524, Test Loss: 0.1348169595003128\n", "Epoch 8248/10000, Training Loss: 0.019764505326747894, Test Loss: 0.1360827088356018\n", "Epoch 8249/10000, Training Loss: 0.01976066827774048, Test Loss: 0.13484059274196625\n", "Epoch 8250/10000, Training Loss: 0.019756445661187172, Test Loss: 0.13603006303310394\n", "Epoch 8251/10000, Training Loss: 0.019751738756895065, Test Loss: 0.13504734635353088\n", "Epoch 8252/10000, Training Loss: 0.01974685862660408, Test Loss: 0.13580507040023804\n", "Epoch 8253/10000, Training Loss: 0.019742010161280632, Test Loss: 0.13536952435970306\n", "Epoch 8254/10000, Training Loss: 0.01973731815814972, Test Loss: 0.13551345467567444\n", "Epoch 8255/10000, Training Loss: 0.01973312348127365, Test Loss: 0.13568158447742462\n", "Epoch 8256/10000, Training Loss: 0.019729221239686012, Test Loss: 0.13527856767177582\n", "Epoch 8257/10000, Training Loss: 0.019725456833839417, Test Loss: 0.13588201999664307\n", "Epoch 8258/10000, Training Loss: 0.019721774384379387, Test Loss: 0.13517722487449646\n", "Epoch 8259/10000, Training Loss: 0.019717875868082047, Test Loss: 0.13592790067195892\n", "Epoch 8260/10000, Training Loss: 0.019713731482625008, Test Loss: 0.13522614538669586\n", "Epoch 8261/10000, Training Loss: 0.019709482789039612, Test Loss: 0.13583272695541382\n", "Epoch 8262/10000, Training Loss: 0.01970507949590683, Test Loss: 0.1353355348110199\n", "Epoch 8263/10000, Training Loss: 0.01970088481903076, Test Loss: 0.13578492403030396\n", "Epoch 8264/10000, Training Loss: 0.019696766510605812, Test Loss: 0.13541507720947266\n", "Epoch 8265/10000, Training Loss: 0.019692616537213326, Test Loss: 0.13580191135406494\n", "Epoch 8266/10000, Training Loss: 0.019688569009304047, Test Loss: 0.13545195758342743\n", "Epoch 8267/10000, Training Loss: 0.01968447118997574, Test Loss: 0.1358799934387207\n", "Epoch 8268/10000, Training Loss: 0.01968049444258213, Test Loss: 0.13545586168766022\n", "Epoch 8269/10000, Training Loss: 0.01967647857964039, Test Loss: 0.13599759340286255\n", "Epoch 8270/10000, Training Loss: 0.019672634080052376, Test Loss: 0.1354132741689682\n", "Epoch 8271/10000, Training Loss: 0.019668854773044586, Test Loss: 0.13621477782726288\n", "Epoch 8272/10000, Training Loss: 0.01966523937880993, Test Loss: 0.13534104824066162\n", "Epoch 8273/10000, Training Loss: 0.019661815837025642, Test Loss: 0.1364176869392395\n", "Epoch 8274/10000, Training Loss: 0.019657928496599197, Test Loss: 0.13545867800712585\n", "Epoch 8275/10000, Training Loss: 0.019653532654047012, Test Loss: 0.13639415800571442\n", "Epoch 8276/10000, Training Loss: 0.019648853689432144, Test Loss: 0.13578660786151886\n", "Epoch 8277/10000, Training Loss: 0.01964416354894638, Test Loss: 0.13626350462436676\n", "Epoch 8278/10000, Training Loss: 0.019639702513813972, Test Loss: 0.13610416650772095\n", "Epoch 8279/10000, Training Loss: 0.019635504111647606, Test Loss: 0.13617858290672302\n", "Epoch 8280/10000, Training Loss: 0.019631490111351013, Test Loss: 0.13630472123622894\n", 
"Epoch 8281/10000, Training Loss: 0.019627468660473824, Test Loss: 0.13619409501552582\n", "Epoch 8282/10000, Training Loss: 0.019623447209596634, Test Loss: 0.13647450506687164\n", "Epoch 8283/10000, Training Loss: 0.019619427621364594, Test Loss: 0.1362820267677307\n", "Epoch 8284/10000, Training Loss: 0.019615396857261658, Test Loss: 0.13664552569389343\n", "Epoch 8285/10000, Training Loss: 0.019611399620771408, Test Loss: 0.13631512224674225\n", "Epoch 8286/10000, Training Loss: 0.019607489928603172, Test Loss: 0.1368420422077179\n", "Epoch 8287/10000, Training Loss: 0.019603583961725235, Test Loss: 0.13634426891803741\n", "Epoch 8288/10000, Training Loss: 0.019599726423621178, Test Loss: 0.13699272274971008\n", "Epoch 8289/10000, Training Loss: 0.019595686346292496, Test Loss: 0.13641048967838287\n", "Epoch 8290/10000, Training Loss: 0.01959170773625374, Test Loss: 0.13708467781543732\n", "Epoch 8291/10000, Training Loss: 0.019587650895118713, Test Loss: 0.13651908934116364\n", "Epoch 8292/10000, Training Loss: 0.019583532586693764, Test Loss: 0.13712692260742188\n", "Epoch 8293/10000, Training Loss: 0.01957927830517292, Test Loss: 0.1366540491580963\n", "Epoch 8294/10000, Training Loss: 0.01957515813410282, Test Loss: 0.13714361190795898\n", "Epoch 8295/10000, Training Loss: 0.019571028649806976, Test Loss: 0.13677802681922913\n", "Epoch 8296/10000, Training Loss: 0.019566969946026802, Test Loss: 0.1371941715478897\n", "Epoch 8297/10000, Training Loss: 0.019562873989343643, Test Loss: 0.1368626356124878\n", "Epoch 8298/10000, Training Loss: 0.019558798521757126, Test Loss: 0.13728459179401398\n", "Epoch 8299/10000, Training Loss: 0.01955479569733143, Test Loss: 0.13691073656082153\n", "Epoch 8300/10000, Training Loss: 0.01955079473555088, Test Loss: 0.13740485906600952\n", "Epoch 8301/10000, Training Loss: 0.019546831026673317, Test Loss: 0.13690364360809326\n", "Epoch 8302/10000, Training Loss: 0.01954309642314911, Test Loss: 0.13761909306049347\n", "Epoch 8303/10000, Training Loss: 0.019539393484592438, Test Loss: 0.1368548572063446\n", "Epoch 8304/10000, Training Loss: 0.019535798579454422, Test Loss: 0.13785843551158905\n", "Epoch 8305/10000, Training Loss: 0.019532205536961555, Test Loss: 0.13683423399925232\n", "Epoch 8306/10000, Training Loss: 0.019528666511178017, Test Loss: 0.13804814219474792\n", "Epoch 8307/10000, Training Loss: 0.019524769857525826, Test Loss: 0.136909618973732\n", "Epoch 8308/10000, Training Loss: 0.019520888105034828, Test Loss: 0.13812945783138275\n", "Epoch 8309/10000, Training Loss: 0.019516466185450554, Test Loss: 0.13711394369602203\n", "Epoch 8310/10000, Training Loss: 0.019512051716446877, Test Loss: 0.13810181617736816\n", "Epoch 8311/10000, Training Loss: 0.01950736716389656, Test Loss: 0.1374487578868866\n", "Epoch 8312/10000, Training Loss: 0.0195026658475399, Test Loss: 0.1379321664571762\n", "Epoch 8313/10000, Training Loss: 0.01949816197156906, Test Loss: 0.13781438767910004\n", "Epoch 8314/10000, Training Loss: 0.019493941217660904, Test Loss: 0.13779236376285553\n", "Epoch 8315/10000, Training Loss: 0.019489960744976997, Test Loss: 0.13808956742286682\n", "Epoch 8316/10000, Training Loss: 0.019486116245388985, Test Loss: 0.13767629861831665\n", "Epoch 8317/10000, Training Loss: 0.019482428207993507, Test Loss: 0.138326495885849\n", "Epoch 8318/10000, Training Loss: 0.01947871595621109, Test Loss: 0.13760781288146973\n", "Epoch 8319/10000, Training Loss: 0.01947515644133091, Test Loss: 0.13852447271347046\n", "Epoch 8320/10000, Training 
Loss: 0.019471367821097374, Test Loss: 0.1376330405473709\n", "Epoch 8321/10000, Training Loss: 0.019467534497380257, Test Loss: 0.1386210173368454\n", "Epoch 8322/10000, Training Loss: 0.019463347271084785, Test Loss: 0.13778254389762878\n", "Epoch 8323/10000, Training Loss: 0.019459085538983345, Test Loss: 0.1386168897151947\n", "Epoch 8324/10000, Training Loss: 0.019454708322882652, Test Loss: 0.13801467418670654\n", "Epoch 8325/10000, Training Loss: 0.019450360909104347, Test Loss: 0.1385819911956787\n", "Epoch 8326/10000, Training Loss: 0.019446073099970818, Test Loss: 0.1382710337638855\n", "Epoch 8327/10000, Training Loss: 0.01944182999432087, Test Loss: 0.13852635025978088\n", "Epoch 8328/10000, Training Loss: 0.01943761296570301, Test Loss: 0.138479083776474\n", "Epoch 8329/10000, Training Loss: 0.019433563575148582, Test Loss: 0.13854433596134186\n", "Epoch 8330/10000, Training Loss: 0.019429579377174377, Test Loss: 0.13859006762504578\n", "Epoch 8331/10000, Training Loss: 0.019425569102168083, Test Loss: 0.13860255479812622\n", "Epoch 8332/10000, Training Loss: 0.019421571865677834, Test Loss: 0.1386939436197281\n", "Epoch 8333/10000, Training Loss: 0.019417520612478256, Test Loss: 0.13871018588542938\n", "Epoch 8334/10000, Training Loss: 0.019413508474826813, Test Loss: 0.13869266211986542\n", "Epoch 8335/10000, Training Loss: 0.01940949261188507, Test Loss: 0.13885653018951416\n", "Epoch 8336/10000, Training Loss: 0.019405514001846313, Test Loss: 0.1387125700712204\n", "Epoch 8337/10000, Training Loss: 0.019401609897613525, Test Loss: 0.1390826404094696\n", "Epoch 8338/10000, Training Loss: 0.019397763535380363, Test Loss: 0.13865326344966888\n", "Epoch 8339/10000, Training Loss: 0.019394000992178917, Test Loss: 0.13934621214866638\n", "Epoch 8340/10000, Training Loss: 0.019390404224395752, Test Loss: 0.1386108100414276\n", "Epoch 8341/10000, Training Loss: 0.01938677206635475, Test Loss: 0.13955461978912354\n", "Epoch 8342/10000, Training Loss: 0.019382992759346962, Test Loss: 0.1386614292860031\n", "Epoch 8343/10000, Training Loss: 0.01937909983098507, Test Loss: 0.13965390622615814\n", "Epoch 8344/10000, Training Loss: 0.019374936819076538, Test Loss: 0.13883022964000702\n", "Epoch 8345/10000, Training Loss: 0.01937064155936241, Test Loss: 0.13965269923210144\n", "Epoch 8346/10000, Training Loss: 0.019366204738616943, Test Loss: 0.1390707790851593\n", "Epoch 8347/10000, Training Loss: 0.019361959770321846, Test Loss: 0.13962332904338837\n", "Epoch 8348/10000, Training Loss: 0.019357655197381973, Test Loss: 0.13933706283569336\n", "Epoch 8349/10000, Training Loss: 0.019353432580828667, Test Loss: 0.13954822719097137\n", "Epoch 8350/10000, Training Loss: 0.01934932917356491, Test Loss: 0.13956350088119507\n", "Epoch 8351/10000, Training Loss: 0.019345290958881378, Test Loss: 0.139492005109787\n", "Epoch 8352/10000, Training Loss: 0.019341377541422844, Test Loss: 0.13980884850025177\n", "Epoch 8353/10000, Training Loss: 0.019337568432092667, Test Loss: 0.13941997289657593\n", "Epoch 8354/10000, Training Loss: 0.019333701580762863, Test Loss: 0.14000719785690308\n", "Epoch 8355/10000, Training Loss: 0.01932997815310955, Test Loss: 0.13938768208026886\n", "Epoch 8356/10000, Training Loss: 0.01932617276906967, Test Loss: 0.14017629623413086\n", "Epoch 8357/10000, Training Loss: 0.01932239904999733, Test Loss: 0.1394224911928177\n", "Epoch 8358/10000, Training Loss: 0.0193184781819582, Test Loss: 0.1402701586484909\n", "Epoch 8359/10000, Training Loss: 0.019314441829919815, Test 
Loss: 0.13954636454582214\n", "Epoch 8360/10000, Training Loss: 0.01931021921336651, Test Loss: 0.1402912139892578\n", "Epoch 8361/10000, Training Loss: 0.019306059926748276, Test Loss: 0.13972879946231842\n", "Epoch 8362/10000, Training Loss: 0.019301792606711388, Test Loss: 0.14028845727443695\n", "Epoch 8363/10000, Training Loss: 0.019297650083899498, Test Loss: 0.13990753889083862\n", "Epoch 8364/10000, Training Loss: 0.019293470308184624, Test Loss: 0.14032045006752014\n", "Epoch 8365/10000, Training Loss: 0.019289446994662285, Test Loss: 0.14003506302833557\n", "Epoch 8366/10000, Training Loss: 0.019285397604107857, Test Loss: 0.14038735628128052\n", "Epoch 8367/10000, Training Loss: 0.019281376153230667, Test Loss: 0.14017252624034882\n", "Epoch 8368/10000, Training Loss: 0.019277328625321388, Test Loss: 0.14046648144721985\n", "Epoch 8369/10000, Training Loss: 0.01927327923476696, Test Loss: 0.14032548666000366\n", "Epoch 8370/10000, Training Loss: 0.01926928572356701, Test Loss: 0.14060631394386292\n", "Epoch 8371/10000, Training Loss: 0.0192653089761734, Test Loss: 0.14037294685840607\n", "Epoch 8372/10000, Training Loss: 0.019261399284005165, Test Loss: 0.14082038402557373\n", "Epoch 8373/10000, Training Loss: 0.01925763674080372, Test Loss: 0.1403697282075882\n", "Epoch 8374/10000, Training Loss: 0.019253814592957497, Test Loss: 0.14104311168193817\n", "Epoch 8375/10000, Training Loss: 0.019250022247433662, Test Loss: 0.14038382470607758\n", "Epoch 8376/10000, Training Loss: 0.019246282055974007, Test Loss: 0.14121362566947937\n", "Epoch 8377/10000, Training Loss: 0.019242411479353905, Test Loss: 0.14046461880207062\n", "Epoch 8378/10000, Training Loss: 0.019238432869315147, Test Loss: 0.14130263030529022\n", "Epoch 8379/10000, Training Loss: 0.01923426240682602, Test Loss: 0.1406189352273941\n", "Epoch 8380/10000, Training Loss: 0.01923011988401413, Test Loss: 0.14132815599441528\n", "Epoch 8381/10000, Training Loss: 0.019225889816880226, Test Loss: 0.14080961048603058\n", "Epoch 8382/10000, Training Loss: 0.01922173984348774, Test Loss: 0.1413399875164032\n", "Epoch 8383/10000, Training Loss: 0.01921754516661167, Test Loss: 0.14098210632801056\n", "Epoch 8384/10000, Training Loss: 0.01921352930366993, Test Loss: 0.1413888782262802\n", "Epoch 8385/10000, Training Loss: 0.019209476187825203, Test Loss: 0.14112597703933716\n", "Epoch 8386/10000, Training Loss: 0.019205408170819283, Test Loss: 0.14142760634422302\n", "Epoch 8387/10000, Training Loss: 0.019201330840587616, Test Loss: 0.141251340508461\n", "Epoch 8388/10000, Training Loss: 0.019197305664420128, Test Loss: 0.14146097004413605\n", "Epoch 8389/10000, Training Loss: 0.019193308427929878, Test Loss: 0.1413070112466812\n", "Epoch 8390/10000, Training Loss: 0.019189326092600822, Test Loss: 0.14160433411598206\n", "Epoch 8391/10000, Training Loss: 0.01918542943894863, Test Loss: 0.14127974212169647\n", "Epoch 8392/10000, Training Loss: 0.019181590527296066, Test Loss: 0.14180399477481842\n", "Epoch 8393/10000, Training Loss: 0.01917785219848156, Test Loss: 0.14123539626598358\n", "Epoch 8394/10000, Training Loss: 0.01917407289147377, Test Loss: 0.14198903739452362\n", "Epoch 8395/10000, Training Loss: 0.01917039044201374, Test Loss: 0.1412394791841507\n", "Epoch 8396/10000, Training Loss: 0.019166558980941772, Test Loss: 0.1421061009168625\n", "Epoch 8397/10000, Training Loss: 0.019162608310580254, Test Loss: 0.1413269191980362\n", "Epoch 8398/10000, Training Loss: 0.019158482551574707, Test Loss: 0.14214201271533966\n", 
"Epoch 8399/10000, Training Loss: 0.01915435492992401, Test Loss: 0.14148585498332977\n", "Epoch 8400/10000, Training Loss: 0.01915004476904869, Test Loss: 0.1420929878950119\n", "Epoch 8401/10000, Training Loss: 0.01914578303694725, Test Loss: 0.1417580395936966\n", "Epoch 8402/10000, Training Loss: 0.019141526892781258, Test Loss: 0.1419859379529953\n", "Epoch 8403/10000, Training Loss: 0.019137423485517502, Test Loss: 0.14209413528442383\n", "Epoch 8404/10000, Training Loss: 0.01913343369960785, Test Loss: 0.14192284643650055\n", "Epoch 8405/10000, Training Loss: 0.019129568710923195, Test Loss: 0.1424017995595932\n", "Epoch 8406/10000, Training Loss: 0.019125815480947495, Test Loss: 0.14190199971199036\n", "Epoch 8407/10000, Training Loss: 0.019122131168842316, Test Loss: 0.1426558792591095\n", "Epoch 8408/10000, Training Loss: 0.019118474796414375, Test Loss: 0.14192213118076324\n", "Epoch 8409/10000, Training Loss: 0.019114673137664795, Test Loss: 0.14282259345054626\n", "Epoch 8410/10000, Training Loss: 0.01911081187427044, Test Loss: 0.14203032851219177\n", "Epoch 8411/10000, Training Loss: 0.019106749445199966, Test Loss: 0.14288482069969177\n", "Epoch 8412/10000, Training Loss: 0.019102567806839943, Test Loss: 0.1422184705734253\n", "Epoch 8413/10000, Training Loss: 0.019098352640867233, Test Loss: 0.14287865161895752\n", "Epoch 8414/10000, Training Loss: 0.01909409835934639, Test Loss: 0.14243124425411224\n", "Epoch 8415/10000, Training Loss: 0.01908993348479271, Test Loss: 0.1428724229335785\n", "Epoch 8416/10000, Training Loss: 0.019085757434368134, Test Loss: 0.14262832701206207\n", "Epoch 8417/10000, Training Loss: 0.01908169873058796, Test Loss: 0.14282295107841492\n", "Epoch 8418/10000, Training Loss: 0.019077587872743607, Test Loss: 0.14283806085586548\n", "Epoch 8419/10000, Training Loss: 0.019073612987995148, Test Loss: 0.14285396039485931\n", "Epoch 8420/10000, Training Loss: 0.019069675356149673, Test Loss: 0.14298734068870544\n", "Epoch 8421/10000, Training Loss: 0.019065655767917633, Test Loss: 0.142869770526886\n", "Epoch 8422/10000, Training Loss: 0.019061757251620293, Test Loss: 0.14320513606071472\n", "Epoch 8423/10000, Training Loss: 0.01905790902674198, Test Loss: 0.14281505346298218\n", "Epoch 8424/10000, Training Loss: 0.019054222851991653, Test Loss: 0.1434425264596939\n", "Epoch 8425/10000, Training Loss: 0.019050532951951027, Test Loss: 0.142772376537323\n", "Epoch 8426/10000, Training Loss: 0.019046785309910774, Test Loss: 0.1436242163181305\n", "Epoch 8427/10000, Training Loss: 0.019043084233999252, Test Loss: 0.14280535280704498\n", "Epoch 8428/10000, Training Loss: 0.019039174541831017, Test Loss: 0.14370974898338318\n", "Epoch 8429/10000, Training Loss: 0.019035130739212036, Test Loss: 0.1429317444562912\n", "Epoch 8430/10000, Training Loss: 0.019030911847949028, Test Loss: 0.14367255568504333\n", "Epoch 8431/10000, Training Loss: 0.01902659609913826, Test Loss: 0.14320652186870575\n", "Epoch 8432/10000, Training Loss: 0.01902216300368309, Test Loss: 0.14356331527233124\n", "Epoch 8433/10000, Training Loss: 0.019018001854419708, Test Loss: 0.1435064673423767\n", "Epoch 8434/10000, Training Loss: 0.019013939425349236, Test Loss: 0.14347770810127258\n", "Epoch 8435/10000, Training Loss: 0.019010024145245552, Test Loss: 0.14378662407398224\n", "Epoch 8436/10000, Training Loss: 0.01900612935423851, Test Loss: 0.1434864103794098\n", "Epoch 8437/10000, Training Loss: 0.01900232955813408, Test Loss: 0.14400452375411987\n", "Epoch 8438/10000, Training Loss: 
0.01899847202003002, Test Loss: 0.1435416340827942\n", "Epoch 8439/10000, Training Loss: 0.018994605168700218, Test Loss: 0.14415687322616577\n", "Epoch 8440/10000, Training Loss: 0.01899064891040325, Test Loss: 0.14366573095321655\n", "Epoch 8441/10000, Training Loss: 0.018986670300364494, Test Loss: 0.14426672458648682\n", "Epoch 8442/10000, Training Loss: 0.018982641398906708, Test Loss: 0.1437797099351883\n", "Epoch 8443/10000, Training Loss: 0.018978608772158623, Test Loss: 0.14436709880828857\n", "Epoch 8444/10000, Training Loss: 0.018974676728248596, Test Loss: 0.14388296008110046\n", "Epoch 8445/10000, Training Loss: 0.018970753997564316, Test Loss: 0.1444624811410904\n", "Epoch 8446/10000, Training Loss: 0.018966734409332275, Test Loss: 0.14397521317005157\n", "Epoch 8447/10000, Training Loss: 0.018962768837809563, Test Loss: 0.14455440640449524\n", "Epoch 8448/10000, Training Loss: 0.018958797678351402, Test Loss: 0.1440582126379013\n", "Epoch 8449/10000, Training Loss: 0.018954843282699585, Test Loss: 0.14460937678813934\n", "Epoch 8450/10000, Training Loss: 0.018950730562210083, Test Loss: 0.14421901106834412\n", "Epoch 8451/10000, Training Loss: 0.018946651369333267, Test Loss: 0.14462091028690338\n", "Epoch 8452/10000, Training Loss: 0.0189425777643919, Test Loss: 0.14441032707691193\n", "Epoch 8453/10000, Training Loss: 0.018938496708869934, Test Loss: 0.14463771879673004\n", "Epoch 8454/10000, Training Loss: 0.018934478983283043, Test Loss: 0.14458292722702026\n", "Epoch 8455/10000, Training Loss: 0.01893049106001854, Test Loss: 0.144706591963768\n", "Epoch 8456/10000, Training Loss: 0.0189264714717865, Test Loss: 0.14472073316574097\n", "Epoch 8457/10000, Training Loss: 0.018922530114650726, Test Loss: 0.1447591781616211\n", "Epoch 8458/10000, Training Loss: 0.018918510526418686, Test Loss: 0.14481063187122345\n", "Epoch 8459/10000, Training Loss: 0.01891457475721836, Test Loss: 0.14486359059810638\n", "Epoch 8460/10000, Training Loss: 0.018910622224211693, Test Loss: 0.14491549134254456\n", "Epoch 8461/10000, Training Loss: 0.018906621262431145, Test Loss: 0.14500343799591064\n", "Epoch 8462/10000, Training Loss: 0.018902653828263283, Test Loss: 0.14497140049934387\n", "Epoch 8463/10000, Training Loss: 0.01889873668551445, Test Loss: 0.14518320560455322\n", "Epoch 8464/10000, Training Loss: 0.018894759938120842, Test Loss: 0.1450059413909912\n", "Epoch 8465/10000, Training Loss: 0.018890852108597755, Test Loss: 0.14540314674377441\n", "Epoch 8466/10000, Training Loss: 0.018887098878622055, Test Loss: 0.14496704936027527\n", "Epoch 8467/10000, Training Loss: 0.018883299082517624, Test Loss: 0.14564743638038635\n", "Epoch 8468/10000, Training Loss: 0.01887970231473446, Test Loss: 0.14491280913352966\n", "Epoch 8469/10000, Training Loss: 0.01887614279985428, Test Loss: 0.1458617001771927\n", "Epoch 8470/10000, Training Loss: 0.018872467800974846, Test Loss: 0.1448981761932373\n", "Epoch 8471/10000, Training Loss: 0.018868790939450264, Test Loss: 0.14599980413913727\n", "Epoch 8472/10000, Training Loss: 0.018864940851926804, Test Loss: 0.14496499300003052\n", "Epoch 8473/10000, Training Loss: 0.018860895186662674, Test Loss: 0.14603644609451294\n", "Epoch 8474/10000, Training Loss: 0.018856719136238098, Test Loss: 0.14512266218662262\n", "Epoch 8475/10000, Training Loss: 0.018852369859814644, Test Loss: 0.14596275985240936\n", "Epoch 8476/10000, Training Loss: 0.018847959116101265, Test Loss: 0.1453935205936432\n", "Epoch 8477/10000, Training Loss: 0.01884349249303341, Test 
Loss: 0.14580176770687103\n", "Epoch 8478/10000, Training Loss: 0.01883920282125473, Test Loss: 0.14575177431106567\n", "Epoch 8479/10000, Training Loss: 0.01883520744740963, Test Loss: 0.14564937353134155\n", "Epoch 8480/10000, Training Loss: 0.018831312656402588, Test Loss: 0.14606498181819916\n", "Epoch 8481/10000, Training Loss: 0.018827572464942932, Test Loss: 0.1456107646226883\n", "Epoch 8482/10000, Training Loss: 0.018823808059096336, Test Loss: 0.14625439047813416\n", "Epoch 8483/10000, Training Loss: 0.0188198983669281, Test Loss: 0.1457255333662033\n", "Epoch 8484/10000, Training Loss: 0.018815873190760612, Test Loss: 0.14631299674510956\n", "Epoch 8485/10000, Training Loss: 0.018811756744980812, Test Loss: 0.14596541225910187\n", "Epoch 8486/10000, Training Loss: 0.01880764029920101, Test Loss: 0.14629191160202026\n", "Epoch 8487/10000, Training Loss: 0.01880352757871151, Test Loss: 0.14625680446624756\n", "Epoch 8488/10000, Training Loss: 0.01879950985312462, Test Loss: 0.1462666541337967\n", "Epoch 8489/10000, Training Loss: 0.018795564770698547, Test Loss: 0.14654795825481415\n", "Epoch 8490/10000, Training Loss: 0.01879166066646576, Test Loss: 0.14624041318893433\n", "Epoch 8491/10000, Training Loss: 0.01878795213997364, Test Loss: 0.14682896435260773\n", "Epoch 8492/10000, Training Loss: 0.018784290179610252, Test Loss: 0.14619134366512299\n", "Epoch 8493/10000, Training Loss: 0.01878071390092373, Test Loss: 0.1470695585012436\n", "Epoch 8494/10000, Training Loss: 0.018777042627334595, Test Loss: 0.14618273079395294\n", "Epoch 8495/10000, Training Loss: 0.018773365765810013, Test Loss: 0.14722268283367157\n", "Epoch 8496/10000, Training Loss: 0.01876961439847946, Test Loss: 0.14625462889671326\n", "Epoch 8497/10000, Training Loss: 0.01876555196940899, Test Loss: 0.14726832509040833\n", "Epoch 8498/10000, Training Loss: 0.018761388957500458, Test Loss: 0.14641308784484863\n", "Epoch 8499/10000, Training Loss: 0.01875712163746357, Test Loss: 0.14722654223442078\n", "Epoch 8500/10000, Training Loss: 0.01875288411974907, Test Loss: 0.14661861956119537\n", "Epoch 8501/10000, Training Loss: 0.018748559057712555, Test Loss: 0.14710451662540436\n", "Epoch 8502/10000, Training Loss: 0.018744204193353653, Test Loss: 0.14693553745746613\n", "Epoch 8503/10000, Training Loss: 0.018740106374025345, Test Loss: 0.1469579041004181\n", "Epoch 8504/10000, Training Loss: 0.018736183643341064, Test Loss: 0.1472388058900833\n", "Epoch 8505/10000, Training Loss: 0.018732409924268723, Test Loss: 0.146894171833992\n", "Epoch 8506/10000, Training Loss: 0.018728571012616158, Test Loss: 0.14744210243225098\n", "Epoch 8507/10000, Training Loss: 0.018724793568253517, Test Loss: 0.14696405827999115\n", "Epoch 8508/10000, Training Loss: 0.018720798194408417, Test Loss: 0.1475248783826828\n", "Epoch 8509/10000, Training Loss: 0.018716754391789436, Test Loss: 0.14715538918972015\n", "Epoch 8510/10000, Training Loss: 0.018712632358074188, Test Loss: 0.1475217044353485\n", "Epoch 8511/10000, Training Loss: 0.018708551302552223, Test Loss: 0.14741146564483643\n", "Epoch 8512/10000, Training Loss: 0.018704546615481377, Test Loss: 0.1474970281124115\n", "Epoch 8513/10000, Training Loss: 0.018700584769248962, Test Loss: 0.14766301214694977\n", "Epoch 8514/10000, Training Loss: 0.0186966173350811, Test Loss: 0.14751042425632477\n", "Epoch 8515/10000, Training Loss: 0.018692724406719208, Test Loss: 0.14786173403263092\n", "Epoch 8516/10000, Training Loss: 0.018688835203647614, Test Loss: 0.147592693567276\n", 
"Epoch 8517/10000, Training Loss: 0.01868491992354393, Test Loss: 0.14799220860004425\n", "Epoch 8518/10000, Training Loss: 0.01868099719285965, Test Loss: 0.1477428376674652\n", "Epoch 8519/10000, Training Loss: 0.018676912412047386, Test Loss: 0.14809563755989075\n", "Epoch 8520/10000, Training Loss: 0.018672969192266464, Test Loss: 0.14786507189273834\n", "Epoch 8521/10000, Training Loss: 0.018669050186872482, Test Loss: 0.14820337295532227\n", "Epoch 8522/10000, Training Loss: 0.01866503618657589, Test Loss: 0.1479773223400116\n", "Epoch 8523/10000, Training Loss: 0.018661119043827057, Test Loss: 0.14831174910068512\n", "Epoch 8524/10000, Training Loss: 0.018657170236110687, Test Loss: 0.14807845652103424\n", "Epoch 8525/10000, Training Loss: 0.018653206527233124, Test Loss: 0.1484290361404419\n", "Epoch 8526/10000, Training Loss: 0.018649306148290634, Test Loss: 0.1481526643037796\n", "Epoch 8527/10000, Training Loss: 0.018645377829670906, Test Loss: 0.14855694770812988\n", "Epoch 8528/10000, Training Loss: 0.018641464412212372, Test Loss: 0.14820954203605652\n", "Epoch 8529/10000, Training Loss: 0.01863752119243145, Test Loss: 0.14867989718914032\n", "Epoch 8530/10000, Training Loss: 0.018633628264069557, Test Loss: 0.14827735722064972\n", "Epoch 8531/10000, Training Loss: 0.018629644066095352, Test Loss: 0.14878694713115692\n", "Epoch 8532/10000, Training Loss: 0.01862575113773346, Test Loss: 0.14835795760154724\n", "Epoch 8533/10000, Training Loss: 0.018621761351823807, Test Loss: 0.14885705709457397\n", "Epoch 8534/10000, Training Loss: 0.018617738038301468, Test Loss: 0.1485038846731186\n", "Epoch 8535/10000, Training Loss: 0.018613694235682487, Test Loss: 0.1488855630159378\n", "Epoch 8536/10000, Training Loss: 0.018609674647450447, Test Loss: 0.14868883788585663\n", "Epoch 8537/10000, Training Loss: 0.018605656921863556, Test Loss: 0.14889799058437347\n", "Epoch 8538/10000, Training Loss: 0.01860162988305092, Test Loss: 0.148884117603302\n", "Epoch 8539/10000, Training Loss: 0.018597688525915146, Test Loss: 0.14891909062862396\n", "Epoch 8540/10000, Training Loss: 0.01859373226761818, Test Loss: 0.14906740188598633\n", "Epoch 8541/10000, Training Loss: 0.018589753657579422, Test Loss: 0.14896397292613983\n", "Epoch 8542/10000, Training Loss: 0.01858583092689514, Test Loss: 0.14922724664211273\n", "Epoch 8543/10000, Training Loss: 0.018581904470920563, Test Loss: 0.1490388661623001\n", "Epoch 8544/10000, Training Loss: 0.018578026443719864, Test Loss: 0.14938722550868988\n", "Epoch 8545/10000, Training Loss: 0.018574092537164688, Test Loss: 0.14907097816467285\n", "Epoch 8546/10000, Training Loss: 0.018570246174931526, Test Loss: 0.14953958988189697\n", "Epoch 8547/10000, Training Loss: 0.018566325306892395, Test Loss: 0.14914652705192566\n", "Epoch 8548/10000, Training Loss: 0.01856236346065998, Test Loss: 0.14965613186359406\n", "Epoch 8549/10000, Training Loss: 0.018558427691459656, Test Loss: 0.14925915002822876\n", "Epoch 8550/10000, Training Loss: 0.018554432317614555, Test Loss: 0.14974470436573029\n", "Epoch 8551/10000, Training Loss: 0.018550453707575798, Test Loss: 0.149398073554039\n", "Epoch 8552/10000, Training Loss: 0.01854642480611801, Test Loss: 0.14981421828269958\n", "Epoch 8553/10000, Training Loss: 0.018542416393756866, Test Loss: 0.14955368638038635\n", "Epoch 8554/10000, Training Loss: 0.018538421019911766, Test Loss: 0.14987459778785706\n", "Epoch 8555/10000, Training Loss: 0.018534373492002487, Test Loss: 0.14971444010734558\n", "Epoch 8556/10000, 
Training Loss: 0.01853039488196373, Test Loss: 0.14993403851985931\n", "Epoch 8557/10000, Training Loss: 0.018526434898376465, Test Loss: 0.14987371861934662\n", "Epoch 8558/10000, Training Loss: 0.018522439524531364, Test Loss: 0.1499977856874466\n", "Epoch 8559/10000, Training Loss: 0.018518472090363503, Test Loss: 0.15002626180648804\n", "Epoch 8560/10000, Training Loss: 0.018514519557356834, Test Loss: 0.15006934106349945\n", "Epoch 8561/10000, Training Loss: 0.018510567024350166, Test Loss: 0.15017062425613403\n", "Epoch 8562/10000, Training Loss: 0.018506590276956558, Test Loss: 0.15014809370040894\n", "Epoch 8563/10000, Training Loss: 0.018502656370401382, Test Loss: 0.15030759572982788\n", "Epoch 8564/10000, Training Loss: 0.018498681485652924, Test Loss: 0.1502324640750885\n", "Epoch 8565/10000, Training Loss: 0.018494751304388046, Test Loss: 0.15045839548110962\n", "Epoch 8566/10000, Training Loss: 0.018490852788090706, Test Loss: 0.1502685546875\n", "Epoch 8567/10000, Training Loss: 0.01848691515624523, Test Loss: 0.15061967074871063\n", "Epoch 8568/10000, Training Loss: 0.018482940271496773, Test Loss: 0.1503143459558487\n", "Epoch 8569/10000, Training Loss: 0.018479106947779655, Test Loss: 0.15077587962150574\n", "Epoch 8570/10000, Training Loss: 0.018475206568837166, Test Loss: 0.15036366879940033\n", "Epoch 8571/10000, Training Loss: 0.01847132295370102, Test Loss: 0.15093080699443817\n", "Epoch 8572/10000, Training Loss: 0.018467383459210396, Test Loss: 0.15041247010231018\n", "Epoch 8573/10000, Training Loss: 0.018463553860783577, Test Loss: 0.1510891318321228\n", "Epoch 8574/10000, Training Loss: 0.018459707498550415, Test Loss: 0.15045543015003204\n", "Epoch 8575/10000, Training Loss: 0.018455905839800835, Test Loss: 0.15125365555286407\n", "Epoch 8576/10000, Training Loss: 0.018452085554599762, Test Loss: 0.1504916399717331\n", "Epoch 8577/10000, Training Loss: 0.018448330461978912, Test Loss: 0.1514432281255722\n", "Epoch 8578/10000, Training Loss: 0.018444683402776718, Test Loss: 0.15047141909599304\n", "Epoch 8579/10000, Training Loss: 0.01844112202525139, Test Loss: 0.15167097747325897\n", "Epoch 8580/10000, Training Loss: 0.018437722697854042, Test Loss: 0.1504012495279312\n", "Epoch 8581/10000, Training Loss: 0.01843445561826229, Test Loss: 0.15192972123622894\n", "Epoch 8582/10000, Training Loss: 0.01843152567744255, Test Loss: 0.1502814143896103\n", "Epoch 8583/10000, Training Loss: 0.01842854730784893, Test Loss: 0.1521608978509903\n", "Epoch 8584/10000, Training Loss: 0.018425535410642624, Test Loss: 0.15022799372673035\n", "Epoch 8585/10000, Training Loss: 0.01842198520898819, Test Loss: 0.15222986042499542\n", "Epoch 8586/10000, Training Loss: 0.01841781660914421, Test Loss: 0.15039291977882385\n", "Epoch 8587/10000, Training Loss: 0.018412699922919273, Test Loss: 0.15198075771331787\n", "Epoch 8588/10000, Training Loss: 0.018406694754958153, Test Loss: 0.15092463791370392\n", "Epoch 8589/10000, Training Loss: 0.018400900065898895, Test Loss: 0.1514808088541031\n", "Epoch 8590/10000, Training Loss: 0.01839594915509224, Test Loss: 0.15156963467597961\n", "Epoch 8591/10000, Training Loss: 0.018392115831375122, Test Loss: 0.1510332077741623\n", "Epoch 8592/10000, Training Loss: 0.01838909089565277, Test Loss: 0.15202602744102478\n", "Epoch 8593/10000, Training Loss: 0.018386125564575195, Test Loss: 0.15088286995887756\n", "Epoch 8594/10000, Training Loss: 0.01838274486362934, Test Loss: 0.152137890458107\n", "Epoch 8595/10000, Training Loss: 
0.01837850548326969, Test Loss: 0.15110087394714355\n", "Epoch 8596/10000, Training Loss: 0.018373874947428703, Test Loss: 0.15193384885787964\n", "Epoch 8597/10000, Training Loss: 0.018369095399975777, Test Loss: 0.15155690908432007\n", "Epoch 8598/10000, Training Loss: 0.018364617601037025, Test Loss: 0.15161526203155518\n", "Epoch 8599/10000, Training Loss: 0.01836067996919155, Test Loss: 0.15200138092041016\n", "Epoch 8600/10000, Training Loss: 0.018357152119278908, Test Loss: 0.15142011642456055\n", "Epoch 8601/10000, Training Loss: 0.01835365779697895, Test Loss: 0.15224291384220123\n", "Epoch 8602/10000, Training Loss: 0.01835000328719616, Test Loss: 0.15147487819194794\n", "Epoch 8603/10000, Training Loss: 0.01834605447947979, Test Loss: 0.15223664045333862\n", "Epoch 8604/10000, Training Loss: 0.018341755494475365, Test Loss: 0.1517484039068222\n", "Epoch 8605/10000, Training Loss: 0.018337421119213104, Test Loss: 0.15207690000534058\n", "Epoch 8606/10000, Training Loss: 0.01833326369524002, Test Loss: 0.1520979404449463\n", "Epoch 8607/10000, Training Loss: 0.0183293167501688, Test Loss: 0.15192267298698425\n", "Epoch 8608/10000, Training Loss: 0.01832554303109646, Test Loss: 0.1523692011833191\n", "Epoch 8609/10000, Training Loss: 0.018321892246603966, Test Loss: 0.15189874172210693\n", "Epoch 8610/10000, Training Loss: 0.01831802725791931, Test Loss: 0.15248359739780426\n", "Epoch 8611/10000, Training Loss: 0.018314136192202568, Test Loss: 0.15203456580638885\n", "Epoch 8612/10000, Training Loss: 0.018310073763132095, Test Loss: 0.1524595320224762\n", "Epoch 8613/10000, Training Loss: 0.01830594800412655, Test Loss: 0.15227261185646057\n", "Epoch 8614/10000, Training Loss: 0.018301915377378464, Test Loss: 0.15237994492053986\n", "Epoch 8615/10000, Training Loss: 0.01829800382256508, Test Loss: 0.15251672267913818\n", "Epoch 8616/10000, Training Loss: 0.018294164910912514, Test Loss: 0.15233653783798218\n", "Epoch 8617/10000, Training Loss: 0.018290264531970024, Test Loss: 0.1526915729045868\n", "Epoch 8618/10000, Training Loss: 0.018286406993865967, Test Loss: 0.15238119661808014\n", "Epoch 8619/10000, Training Loss: 0.018282560631632805, Test Loss: 0.1527722328901291\n", "Epoch 8620/10000, Training Loss: 0.01827860437333584, Test Loss: 0.15251389145851135\n", "Epoch 8621/10000, Training Loss: 0.018274594098329544, Test Loss: 0.15278048813343048\n", "Epoch 8622/10000, Training Loss: 0.018270619213581085, Test Loss: 0.15269529819488525\n", "Epoch 8623/10000, Training Loss: 0.018266642466187477, Test Loss: 0.15276533365249634\n", "Epoch 8624/10000, Training Loss: 0.018262773752212524, Test Loss: 0.15287412703037262\n", "Epoch 8625/10000, Training Loss: 0.018258847296237946, Test Loss: 0.15277378261089325\n", "Epoch 8626/10000, Training Loss: 0.018254946917295456, Test Loss: 0.153012216091156\n", "Epoch 8627/10000, Training Loss: 0.018251067027449608, Test Loss: 0.1528310477733612\n", "Epoch 8628/10000, Training Loss: 0.018247151747345924, Test Loss: 0.1530987173318863\n", "Epoch 8629/10000, Training Loss: 0.0182432159781456, Test Loss: 0.1529359370470047\n", "Epoch 8630/10000, Training Loss: 0.01823926344513893, Test Loss: 0.15314427018165588\n", "Epoch 8631/10000, Training Loss: 0.018235327675938606, Test Loss: 0.15307043492794037\n", "Epoch 8632/10000, Training Loss: 0.01823139190673828, Test Loss: 0.15317142009735107\n", "Epoch 8633/10000, Training Loss: 0.018227489665150642, Test Loss: 0.15321019291877747\n", "Epoch 8634/10000, Training Loss: 0.01822357065975666, Test Loss: 
0.15320400893688202\n", "Epoch 8635/10000, Training Loss: 0.01821967028081417, Test Loss: 0.1533350944519043\n", "Epoch 8636/10000, Training Loss: 0.018215712159872055, Test Loss: 0.1532566100358963\n", "Epoch 8637/10000, Training Loss: 0.018211809918284416, Test Loss: 0.15343685448169708\n", "Epoch 8638/10000, Training Loss: 0.018207823857665062, Test Loss: 0.15333260595798492\n", "Epoch 8639/10000, Training Loss: 0.01820395700633526, Test Loss: 0.15351572632789612\n", "Epoch 8640/10000, Training Loss: 0.018199993297457695, Test Loss: 0.15342901647090912\n", "Epoch 8641/10000, Training Loss: 0.01819603703916073, Test Loss: 0.15357708930969238\n", "Epoch 8642/10000, Training Loss: 0.018192168325185776, Test Loss: 0.1535373330116272\n", "Epoch 8643/10000, Training Loss: 0.0181882344186306, Test Loss: 0.15363018214702606\n", "Epoch 8644/10000, Training Loss: 0.018184302374720573, Test Loss: 0.15364806354045868\n", "Epoch 8645/10000, Training Loss: 0.01818034239113331, Test Loss: 0.1536841243505478\n", "Epoch 8646/10000, Training Loss: 0.01817648485302925, Test Loss: 0.1537543684244156\n", "Epoch 8647/10000, Training Loss: 0.01817248947918415, Test Loss: 0.15374259650707245\n", "Epoch 8648/10000, Training Loss: 0.01816864125430584, Test Loss: 0.1538529098033905\n", "Epoch 8649/10000, Training Loss: 0.018164614215493202, Test Loss: 0.15380962193012238\n", "Epoch 8650/10000, Training Loss: 0.018160738050937653, Test Loss: 0.15394152700901031\n", "Epoch 8651/10000, Training Loss: 0.018156783655285835, Test Loss: 0.15388518571853638\n", "Epoch 8652/10000, Training Loss: 0.018152885138988495, Test Loss: 0.15402232110500336\n", "Epoch 8653/10000, Training Loss: 0.01814900152385235, Test Loss: 0.15396632254123688\n", "Epoch 8654/10000, Training Loss: 0.01814504712820053, Test Loss: 0.154097780585289\n", "Epoch 8655/10000, Training Loss: 0.01814109832048416, Test Loss: 0.15405189990997314\n", "Epoch 8656/10000, Training Loss: 0.018137168139219284, Test Loss: 0.1541689932346344\n", "Epoch 8657/10000, Training Loss: 0.018133200705051422, Test Loss: 0.15413954854011536\n", "Epoch 8658/10000, Training Loss: 0.018129289150238037, Test Loss: 0.1542380303144455\n", "Epoch 8659/10000, Training Loss: 0.018125317990779877, Test Loss: 0.1542271077632904\n", "Epoch 8660/10000, Training Loss: 0.018121439963579178, Test Loss: 0.15430665016174316\n", "Epoch 8661/10000, Training Loss: 0.018117517232894897, Test Loss: 0.15431399643421173\n", "Epoch 8662/10000, Training Loss: 0.018113533034920692, Test Loss: 0.15437577664852142\n", "Epoch 8663/10000, Training Loss: 0.018109597265720367, Test Loss: 0.15439915657043457\n", "Epoch 8664/10000, Training Loss: 0.01810566522181034, Test Loss: 0.15444603562355042\n", "Epoch 8665/10000, Training Loss: 0.018101710826158524, Test Loss: 0.15448221564292908\n", "Epoch 8666/10000, Training Loss: 0.0180977676063776, Test Loss: 0.15451748669147491\n", "Epoch 8667/10000, Training Loss: 0.018093902617692947, Test Loss: 0.1545644998550415\n", "Epoch 8668/10000, Training Loss: 0.018089918419718742, Test Loss: 0.1545884907245636\n", "Epoch 8669/10000, Training Loss: 0.018085971474647522, Test Loss: 0.1546459048986435\n", "Epoch 8670/10000, Training Loss: 0.0180820319801569, Test Loss: 0.15465927124023438\n", "Epoch 8671/10000, Training Loss: 0.018078161403536797, Test Loss: 0.15472641587257385\n", "Epoch 8672/10000, Training Loss: 0.01807416044175625, Test Loss: 0.15472927689552307\n", "Epoch 8673/10000, Training Loss: 0.018070220947265625, Test Loss: 0.15480685234069824\n", "Epoch 
8674/10000, Training Loss: 0.018066281452775, Test Loss: 0.1547972708940506\n", "Epoch 8675/10000, Training Loss: 0.018062392249703407, Test Loss: 0.15488801896572113\n", "Epoch 8676/10000, Training Loss: 0.018058400601148605, Test Loss: 0.15486307442188263\n", "Epoch 8677/10000, Training Loss: 0.018054503947496414, Test Loss: 0.15496955811977386\n", "Epoch 8678/10000, Training Loss: 0.01805048994719982, Test Loss: 0.15492647886276245\n", "Epoch 8679/10000, Training Loss: 0.018046636134386063, Test Loss: 0.15505392849445343\n", "Epoch 8680/10000, Training Loss: 0.018042661249637604, Test Loss: 0.15498478710651398\n", "Epoch 8681/10000, Training Loss: 0.018038678914308548, Test Loss: 0.15514303743839264\n", "Epoch 8682/10000, Training Loss: 0.018034791573882103, Test Loss: 0.15503714978694916\n", "Epoch 8683/10000, Training Loss: 0.018030818551778793, Test Loss: 0.155238538980484\n", "Epoch 8684/10000, Training Loss: 0.018026933073997498, Test Loss: 0.15508031845092773\n", "Epoch 8685/10000, Training Loss: 0.018022960051894188, Test Loss: 0.1553449183702469\n", "Epoch 8686/10000, Training Loss: 0.018019109964370728, Test Loss: 0.15510927140712738\n", "Epoch 8687/10000, Training Loss: 0.01801515929400921, Test Loss: 0.15546715259552002\n", "Epoch 8688/10000, Training Loss: 0.01801130920648575, Test Loss: 0.1551176905632019\n", "Epoch 8689/10000, Training Loss: 0.018007399514317513, Test Loss: 0.1556142419576645\n", "Epoch 8690/10000, Training Loss: 0.01800360344350338, Test Loss: 0.15509402751922607\n", "Epoch 8691/10000, Training Loss: 0.017999831587076187, Test Loss: 0.15579935908317566\n", "Epoch 8692/10000, Training Loss: 0.017996158450841904, Test Loss: 0.15502303838729858\n", "Epoch 8693/10000, Training Loss: 0.017992539331316948, Test Loss: 0.15603823959827423\n", "Epoch 8694/10000, Training Loss: 0.017989173531532288, Test Loss: 0.15488800406455994\n", "Epoch 8695/10000, Training Loss: 0.017986083403229713, Test Loss: 0.1563437581062317\n", "Epoch 8696/10000, Training Loss: 0.01798318512737751, Test Loss: 0.1546858847141266\n", "Epoch 8697/10000, Training Loss: 0.017980802804231644, Test Loss: 0.15669362246990204\n", "Epoch 8698/10000, Training Loss: 0.017978478223085403, Test Loss: 0.15447841584682465\n", "Epoch 8699/10000, Training Loss: 0.017976555973291397, Test Loss: 0.15696528553962708\n", "Epoch 8700/10000, Training Loss: 0.01797359064221382, Test Loss: 0.15446589887142181\n", "Epoch 8701/10000, Training Loss: 0.017969921231269836, Test Loss: 0.15691111981868744\n", "Epoch 8702/10000, Training Loss: 0.017963994294404984, Test Loss: 0.15489932894706726\n", "Epoch 8703/10000, Training Loss: 0.017957201227545738, Test Loss: 0.156397745013237\n", "Epoch 8704/10000, Training Loss: 0.017950143665075302, Test Loss: 0.15570418536663055\n", "Epoch 8705/10000, Training Loss: 0.017944460734725, Test Loss: 0.15569597482681274\n", "Epoch 8706/10000, Training Loss: 0.01794067770242691, Test Loss: 0.15645179152488708\n", "Epoch 8707/10000, Training Loss: 0.017938174307346344, Test Loss: 0.15524426102638245\n", "Epoch 8708/10000, Training Loss: 0.017935866490006447, Test Loss: 0.15678617358207703\n", "Epoch 8709/10000, Training Loss: 0.017932618036866188, Test Loss: 0.15529803931713104\n", "Epoch 8710/10000, Training Loss: 0.017928171902894974, Test Loss: 0.15660758316516876\n", "Epoch 8711/10000, Training Loss: 0.01792282611131668, Test Loss: 0.155803844332695\n", "Epoch 8712/10000, Training Loss: 0.01791766658425331, Test Loss: 0.15612907707691193\n", "Epoch 8713/10000, Training Loss: 
0.01791323535144329, Test Loss: 0.1564064472913742\n", "Epoch 8714/10000, Training Loss: 0.017909793183207512, Test Loss: 0.15574528276920319\n", "Epoch 8715/10000, Training Loss: 0.0179066713899374, Test Loss: 0.15674512088298798\n", "Epoch 8716/10000, Training Loss: 0.017903348430991173, Test Loss: 0.1557246297597885\n", "Epoch 8717/10000, Training Loss: 0.017899448052048683, Test Loss: 0.15669381618499756\n", "Epoch 8718/10000, Training Loss: 0.017894933000206947, Test Loss: 0.15605755150318146\n", "Epoch 8719/10000, Training Loss: 0.017890412360429764, Test Loss: 0.1563928723335266\n", "Epoch 8720/10000, Training Loss: 0.017886124551296234, Test Loss: 0.15650016069412231\n", "Epoch 8721/10000, Training Loss: 0.01788235269486904, Test Loss: 0.15612202882766724\n", "Epoch 8722/10000, Training Loss: 0.017878906801342964, Test Loss: 0.15678644180297852\n", "Epoch 8723/10000, Training Loss: 0.01787528581917286, Test Loss: 0.15608784556388855\n", "Epoch 8724/10000, Training Loss: 0.017871446907520294, Test Loss: 0.1568046510219574\n", "Epoch 8725/10000, Training Loss: 0.01786731742322445, Test Loss: 0.15630604326725006\n", "Epoch 8726/10000, Training Loss: 0.017863158136606216, Test Loss: 0.15663252770900726\n", "Epoch 8727/10000, Training Loss: 0.017859013751149178, Test Loss: 0.15662653744220734\n", "Epoch 8728/10000, Training Loss: 0.01785510592162609, Test Loss: 0.1564522534608841\n", "Epoch 8729/10000, Training Loss: 0.017851445823907852, Test Loss: 0.1568678468465805\n", "Epoch 8730/10000, Training Loss: 0.017847726121544838, Test Loss: 0.1564113348722458\n", "Epoch 8731/10000, Training Loss: 0.017843902111053467, Test Loss: 0.1569383591413498\n", "Epoch 8732/10000, Training Loss: 0.017839916050434113, Test Loss: 0.15654179453849792\n", "Epoch 8733/10000, Training Loss: 0.017835896462202072, Test Loss: 0.15686635673046112\n", "Epoch 8734/10000, Training Loss: 0.01783186011016369, Test Loss: 0.15676714479923248\n", "Epoch 8735/10000, Training Loss: 0.017827901989221573, Test Loss: 0.15675637125968933\n", "Epoch 8736/10000, Training Loss: 0.01782410591840744, Test Loss: 0.15697304904460907\n", "Epoch 8737/10000, Training Loss: 0.017820298671722412, Test Loss: 0.1567116230726242\n", "Epoch 8738/10000, Training Loss: 0.017816461622714996, Test Loss: 0.15708212554454803\n", "Epoch 8739/10000, Training Loss: 0.017812611535191536, Test Loss: 0.1567762792110443\n", "Epoch 8740/10000, Training Loss: 0.017808709293603897, Test Loss: 0.15708762407302856\n", "Epoch 8741/10000, Training Loss: 0.0178048238158226, Test Loss: 0.15692463517189026\n", "Epoch 8742/10000, Training Loss: 0.0178008284419775, Test Loss: 0.157039076089859\n", "Epoch 8743/10000, Training Loss: 0.017796939238905907, Test Loss: 0.15709282457828522\n", "Epoch 8744/10000, Training Loss: 0.017793044447898865, Test Loss: 0.15700291097164154\n", "Epoch 8745/10000, Training Loss: 0.017789172008633614, Test Loss: 0.1572212427854538\n", "Epoch 8746/10000, Training Loss: 0.017785361036658287, Test Loss: 0.157024085521698\n", "Epoch 8747/10000, Training Loss: 0.0177814569324255, Test Loss: 0.15728460252285004\n", "Epoch 8748/10000, Training Loss: 0.017777565866708755, Test Loss: 0.15710800886154175\n", "Epoch 8749/10000, Training Loss: 0.017773697152733803, Test Loss: 0.1572953462600708\n", "Epoch 8750/10000, Training Loss: 0.017769740894436836, Test Loss: 0.15722918510437012\n", "Epoch 8751/10000, Training Loss: 0.017765823751688004, Test Loss: 0.15728715062141418\n", "Epoch 8752/10000, Training Loss: 0.01776190474629402, Test Loss: 
0.15735125541687012\n", "Epoch 8753/10000, Training Loss: 0.01775806024670601, Test Loss: 0.15729306638240814\n", "Epoch 8754/10000, Training Loss: 0.0177542082965374, Test Loss: 0.15744739770889282\n", "Epoch 8755/10000, Training Loss: 0.017750326544046402, Test Loss: 0.15733195841312408\n", "Epoch 8756/10000, Training Loss: 0.017746420577168465, Test Loss: 0.15750785171985626\n", "Epoch 8757/10000, Training Loss: 0.017742550000548363, Test Loss: 0.15740425884723663\n", "Epoch 8758/10000, Training Loss: 0.01773863285779953, Test Loss: 0.15754029154777527\n", "Epoch 8759/10000, Training Loss: 0.017734795808792114, Test Loss: 0.15749689936637878\n", "Epoch 8760/10000, Training Loss: 0.017730841413140297, Test Loss: 0.15756089985370636\n", "Epoch 8761/10000, Training Loss: 0.01772698014974594, Test Loss: 0.15759295225143433\n", "Epoch 8762/10000, Training Loss: 0.01772303320467472, Test Loss: 0.15758508443832397\n", "Epoch 8763/10000, Training Loss: 0.017719177529215813, Test Loss: 0.15767952799797058\n", "Epoch 8764/10000, Training Loss: 0.017715295776724815, Test Loss: 0.15762421488761902\n", "Epoch 8765/10000, Training Loss: 0.01771138235926628, Test Loss: 0.15774816274642944\n", "Epoch 8766/10000, Training Loss: 0.01770751178264618, Test Loss: 0.15768127143383026\n", "Epoch 8767/10000, Training Loss: 0.017703553661704063, Test Loss: 0.1578007936477661\n", "Epoch 8768/10000, Training Loss: 0.01769966259598732, Test Loss: 0.15775184333324432\n", "Epoch 8769/10000, Training Loss: 0.01769580878317356, Test Loss: 0.15784326195716858\n", "Epoch 8770/10000, Training Loss: 0.01769191399216652, Test Loss: 0.15782947838306427\n", "Epoch 8771/10000, Training Loss: 0.017687998712062836, Test Loss: 0.15788190066814423\n", "Epoch 8772/10000, Training Loss: 0.017684098333120346, Test Loss: 0.15790754556655884\n", "Epoch 8773/10000, Training Loss: 0.017680173739790916, Test Loss: 0.1579219251871109\n", "Epoch 8774/10000, Training Loss: 0.01767629012465477, Test Loss: 0.15798160433769226\n", "Epoch 8775/10000, Training Loss: 0.01767239347100258, Test Loss: 0.15796832740306854\n", "Epoch 8776/10000, Training Loss: 0.017668547108769417, Test Loss: 0.15804818272590637\n", "Epoch 8777/10000, Training Loss: 0.017664600163698196, Test Loss: 0.1580217033624649\n", "Epoch 8778/10000, Training Loss: 0.0176607146859169, Test Loss: 0.15810753405094147\n", "Epoch 8779/10000, Training Loss: 0.01765679381787777, Test Loss: 0.15808212757110596\n", "Epoch 8780/10000, Training Loss: 0.017652925103902817, Test Loss: 0.1581609696149826\n", "Epoch 8781/10000, Training Loss: 0.01764899305999279, Test Loss: 0.15814663469791412\n", "Epoch 8782/10000, Training Loss: 0.017645079642534256, Test Loss: 0.15821175277233124\n", "Epoch 8783/10000, Training Loss: 0.01764116995036602, Test Loss: 0.15821275115013123\n", "Epoch 8784/10000, Training Loss: 0.017637280747294426, Test Loss: 0.15826117992401123\n", "Epoch 8785/10000, Training Loss: 0.017633311450481415, Test Loss: 0.15827900171279907\n", "Epoch 8786/10000, Training Loss: 0.017629474401474, Test Loss: 0.15831129252910614\n", "Epoch 8787/10000, Training Loss: 0.01762559823691845, Test Loss: 0.15834403038024902\n", "Epoch 8788/10000, Training Loss: 0.017621664330363274, Test Loss: 0.1583627462387085\n", "Epoch 8789/10000, Training Loss: 0.017617741599678993, Test Loss: 0.15840716660022736\n", "Epoch 8790/10000, Training Loss: 0.01761382818222046, Test Loss: 0.158416286110878\n", "Epoch 8791/10000, Training Loss: 0.017609942704439163, Test Loss: 0.15846851468086243\n", "Epoch 
8792/10000, Training Loss: 0.01760600320994854, Test Loss: 0.15847082436084747\n", "Epoch 8793/10000, Training Loss: 0.017602112144231796, Test Loss: 0.15852908790111542\n", "Epoch 8794/10000, Training Loss: 0.01759818010032177, Test Loss: 0.15852652490139008\n", "Epoch 8795/10000, Training Loss: 0.017594296485185623, Test Loss: 0.15858827531337738\n", "Epoch 8796/10000, Training Loss: 0.017590394243597984, Test Loss: 0.15858305990695953\n", "Epoch 8797/10000, Training Loss: 0.017586469650268555, Test Loss: 0.1586471050977707\n", "Epoch 8798/10000, Training Loss: 0.01758253015577793, Test Loss: 0.1586391031742096\n", "Epoch 8799/10000, Training Loss: 0.017578626051545143, Test Loss: 0.15870659053325653\n", "Epoch 8800/10000, Training Loss: 0.0175747349858284, Test Loss: 0.15869535505771637\n", "Epoch 8801/10000, Training Loss: 0.017570801079273224, Test Loss: 0.15876497328281403\n", "Epoch 8802/10000, Training Loss: 0.017566923052072525, Test Loss: 0.15875257551670074\n", "Epoch 8803/10000, Training Loss: 0.01756305620074272, Test Loss: 0.15882331132888794\n", "Epoch 8804/10000, Training Loss: 0.01755906082689762, Test Loss: 0.15880827605724335\n", "Epoch 8805/10000, Training Loss: 0.017555169761180878, Test Loss: 0.15888401865959167\n", "Epoch 8806/10000, Training Loss: 0.017551211640238762, Test Loss: 0.15886181592941284\n", "Epoch 8807/10000, Training Loss: 0.017547279596328735, Test Loss: 0.15894672274589539\n", "Epoch 8808/10000, Training Loss: 0.017543436959385872, Test Loss: 0.15891289710998535\n", "Epoch 8809/10000, Training Loss: 0.017539480701088905, Test Loss: 0.1590115875005722\n", "Epoch 8810/10000, Training Loss: 0.01753557100892067, Test Loss: 0.1589622050523758\n", "Epoch 8811/10000, Training Loss: 0.017531631514430046, Test Loss: 0.15907910466194153\n", "Epoch 8812/10000, Training Loss: 0.017527777701616287, Test Loss: 0.159007266163826\n", "Epoch 8813/10000, Training Loss: 0.017523841932415962, Test Loss: 0.15915198624134064\n", "Epoch 8814/10000, Training Loss: 0.017519932240247726, Test Loss: 0.15904612839221954\n", "Epoch 8815/10000, Training Loss: 0.017515994608402252, Test Loss: 0.15923157334327698\n", "Epoch 8816/10000, Training Loss: 0.01751212775707245, Test Loss: 0.15907655656337738\n", "Epoch 8817/10000, Training Loss: 0.017508169636130333, Test Loss: 0.15932217240333557\n", "Epoch 8818/10000, Training Loss: 0.01750427484512329, Test Loss: 0.15909253060817719\n", "Epoch 8819/10000, Training Loss: 0.01750038005411625, Test Loss: 0.15943147242069244\n", "Epoch 8820/10000, Training Loss: 0.017496498301625252, Test Loss: 0.1590854525566101\n", "Epoch 8821/10000, Training Loss: 0.01749270223081112, Test Loss: 0.15957003831863403\n", "Epoch 8822/10000, Training Loss: 0.017488840967416763, Test Loss: 0.1590418517589569\n", "Epoch 8823/10000, Training Loss: 0.017485128715634346, Test Loss: 0.15975351631641388\n", "Epoch 8824/10000, Training Loss: 0.017481481656432152, Test Loss: 0.15894277393817902\n", "Epoch 8825/10000, Training Loss: 0.017477983608841896, Test Loss: 0.16000278294086456\n", "Epoch 8826/10000, Training Loss: 0.017474688589572906, Test Loss: 0.15876434743404388\n", "Epoch 8827/10000, Training Loss: 0.017471736297011375, Test Loss: 0.16033782064914703\n", "Epoch 8828/10000, Training Loss: 0.017469188198447227, Test Loss: 0.15849721431732178\n", "Epoch 8829/10000, Training Loss: 0.01746721938252449, Test Loss: 0.16073647141456604\n", "Epoch 8830/10000, Training Loss: 0.01746542751789093, Test Loss: 0.15821443498134613\n", "Epoch 8831/10000, Training Loss: 
0.01746414601802826, Test Loss: 0.16104334592819214\n", "Epoch 8832/10000, Training Loss: 0.01746147871017456, Test Loss: 0.15818053483963013\n", "Epoch 8833/10000, Training Loss: 0.017457854002714157, Test Loss: 0.1609288454055786\n", "Epoch 8834/10000, Training Loss: 0.017451051622629166, Test Loss: 0.15872624516487122\n", "Epoch 8835/10000, Training Loss: 0.01744314283132553, Test Loss: 0.16023606061935425\n", "Epoch 8836/10000, Training Loss: 0.01743541657924652, Test Loss: 0.1597052961587906\n", "Epoch 8837/10000, Training Loss: 0.017430029809474945, Test Loss: 0.15937186777591705\n", "Epoch 8838/10000, Training Loss: 0.017427116632461548, Test Loss: 0.16052240133285522\n", "Epoch 8839/10000, Training Loss: 0.017425386235117912, Test Loss: 0.15891574323177338\n", "Epoch 8840/10000, Training Loss: 0.017423255369067192, Test Loss: 0.1607433408498764\n", "Epoch 8841/10000, Training Loss: 0.01741925999522209, Test Loss: 0.15914826095104218\n", "Epoch 8842/10000, Training Loss: 0.01741381734609604, Test Loss: 0.16032907366752625\n", "Epoch 8843/10000, Training Loss: 0.01740795001387596, Test Loss: 0.15985925495624542\n", "Epoch 8844/10000, Training Loss: 0.017403138801455498, Test Loss: 0.15969343483448029\n", "Epoch 8845/10000, Training Loss: 0.01739964447915554, Test Loss: 0.16049131751060486\n", "Epoch 8846/10000, Training Loss: 0.01739691011607647, Test Loss: 0.15937210619449615\n", "Epoch 8847/10000, Training Loss: 0.017393935471773148, Test Loss: 0.16064593195915222\n", "Epoch 8848/10000, Training Loss: 0.017389871180057526, Test Loss: 0.15958723425865173\n", "Epoch 8849/10000, Training Loss: 0.017385169863700867, Test Loss: 0.1603301763534546\n", "Epoch 8850/10000, Training Loss: 0.017380371689796448, Test Loss: 0.16012369096279144\n", "Epoch 8851/10000, Training Loss: 0.01737622357904911, Test Loss: 0.15989874303340912\n", "Epoch 8852/10000, Training Loss: 0.017372699454426765, Test Loss: 0.16055749356746674\n", "Epoch 8853/10000, Training Loss: 0.017369430512189865, Test Loss: 0.15973308682441711\n", "Epoch 8854/10000, Training Loss: 0.017365876585245132, Test Loss: 0.16062910854816437\n", "Epoch 8855/10000, Training Loss: 0.01736188493669033, Test Loss: 0.159946471452713\n", "Epoch 8856/10000, Training Loss: 0.017357470467686653, Test Loss: 0.1603914052248001\n", "Epoch 8857/10000, Training Loss: 0.017353249713778496, Test Loss: 0.16034625470638275\n", "Epoch 8858/10000, Training Loss: 0.01734934374690056, Test Loss: 0.16011469066143036\n", "Epoch 8859/10000, Training Loss: 0.017345720902085304, Test Loss: 0.1606418341398239\n", "Epoch 8860/10000, Training Loss: 0.01734215021133423, Test Loss: 0.16004376113414764\n", "Epoch 8861/10000, Training Loss: 0.017338400706648827, Test Loss: 0.16067981719970703\n", "Epoch 8862/10000, Training Loss: 0.017334388568997383, Test Loss: 0.16022798418998718\n", "Epoch 8863/10000, Training Loss: 0.017330333590507507, Test Loss: 0.16051970422267914\n", "Epoch 8864/10000, Training Loss: 0.01732630282640457, Test Loss: 0.16052260994911194\n", "Epoch 8865/10000, Training Loss: 0.017322415485978127, Test Loss: 0.16034702956676483\n", "Epoch 8866/10000, Training Loss: 0.017318757250905037, Test Loss: 0.16073790192604065\n", "Epoch 8867/10000, Training Loss: 0.017315024510025978, Test Loss: 0.1603187620639801\n", "Epoch 8868/10000, Training Loss: 0.017311178147792816, Test Loss: 0.1607784628868103\n", "Epoch 8869/10000, Training Loss: 0.017307335510849953, Test Loss: 0.16046030819416046\n", "Epoch 8870/10000, Training Loss: 0.01730331964790821, Test 
Loss: 0.16068698465824127\n", "Epoch 8871/10000, Training Loss: 0.017299417406320572, Test Loss: 0.1606774628162384\n", "Epoch 8872/10000, Training Loss: 0.017295515164732933, Test Loss: 0.16058264672756195\n", "Epoch 8873/10000, Training Loss: 0.017291735857725143, Test Loss: 0.1608482301235199\n", "Epoch 8874/10000, Training Loss: 0.017288021743297577, Test Loss: 0.16056792438030243\n", "Epoch 8875/10000, Training Loss: 0.017284179106354713, Test Loss: 0.16090796887874603\n", "Epoch 8876/10000, Training Loss: 0.01728028990328312, Test Loss: 0.16066470742225647\n", "Epoch 8877/10000, Training Loss: 0.01727641373872757, Test Loss: 0.1608746200799942\n", "Epoch 8878/10000, Training Loss: 0.01727250963449478, Test Loss: 0.16082355380058289\n", "Epoch 8879/10000, Training Loss: 0.017268577590584755, Test Loss: 0.16081702709197998\n", "Epoch 8880/10000, Training Loss: 0.01726480759680271, Test Loss: 0.16096919775009155\n", "Epoch 8881/10000, Training Loss: 0.017260996624827385, Test Loss: 0.16080345213413239\n", "Epoch 8882/10000, Training Loss: 0.017257139086723328, Test Loss: 0.16105066239833832\n", "Epoch 8883/10000, Training Loss: 0.017253298312425613, Test Loss: 0.1608617901802063\n", "Epoch 8884/10000, Training Loss: 0.017249414697289467, Test Loss: 0.1610647290945053\n", "Epoch 8885/10000, Training Loss: 0.017245573922991753, Test Loss: 0.16097384691238403\n", "Epoch 8886/10000, Training Loss: 0.017241694033145905, Test Loss: 0.1610455960035324\n", "Epoch 8887/10000, Training Loss: 0.01723785139620304, Test Loss: 0.16109679639339447\n", "Epoch 8888/10000, Training Loss: 0.017234012484550476, Test Loss: 0.1610366404056549\n", "Epoch 8889/10000, Training Loss: 0.017230156809091568, Test Loss: 0.16119232773780823\n", "Epoch 8890/10000, Training Loss: 0.0172263216227293, Test Loss: 0.16106733679771423\n", "Epoch 8891/10000, Training Loss: 0.017222460359334946, Test Loss: 0.16124309599399567\n", "Epoch 8892/10000, Training Loss: 0.01721859537065029, Test Loss: 0.16113996505737305\n", "Epoch 8893/10000, Training Loss: 0.017214735969901085, Test Loss: 0.1612609475851059\n", "Epoch 8894/10000, Training Loss: 0.01721085235476494, Test Loss: 0.16123446822166443\n", "Epoch 8895/10000, Training Loss: 0.01720702461898327, Test Loss: 0.1612687110900879\n", "Epoch 8896/10000, Training Loss: 0.017203137278556824, Test Loss: 0.16132712364196777\n", "Epoch 8897/10000, Training Loss: 0.017199305817484856, Test Loss: 0.16128888726234436\n", "Epoch 8898/10000, Training Loss: 0.017195433378219604, Test Loss: 0.16140075027942657\n", "Epoch 8899/10000, Training Loss: 0.017191598191857338, Test Loss: 0.16133153438568115\n", "Epoch 8900/10000, Training Loss: 0.017187735065817833, Test Loss: 0.16145111620426178\n", "Epoch 8901/10000, Training Loss: 0.017183875665068626, Test Loss: 0.16139622032642365\n", "Epoch 8902/10000, Training Loss: 0.01717999577522278, Test Loss: 0.16148389875888824\n", "Epoch 8903/10000, Training Loss: 0.017176156863570213, Test Loss: 0.161472886800766\n", "Epoch 8904/10000, Training Loss: 0.017172211781144142, Test Loss: 0.16151034832000732\n", "Epoch 8905/10000, Training Loss: 0.017168398946523666, Test Loss: 0.16155025362968445\n", "Epoch 8906/10000, Training Loss: 0.017164556309580803, Test Loss: 0.1615418791770935\n", "Epoch 8907/10000, Training Loss: 0.017160676419734955, Test Loss: 0.1616186946630478\n", "Epoch 8908/10000, Training Loss: 0.01715681329369545, Test Loss: 0.16158436238765717\n", "Epoch 8909/10000, Training Loss: 0.017152953892946243, Test Loss: 0.1616755872964859\n", 
"Epoch 8910/10000, Training Loss: 0.017149092629551888, Test Loss: 0.16163866221904755\n", "Epoch 8911/10000, Training Loss: 0.017145223915576935, Test Loss: 0.16172143816947937\n", "Epoch 8912/10000, Training Loss: 0.01714136451482773, Test Loss: 0.16170217096805573\n", "Epoch 8913/10000, Training Loss: 0.017137492075562477, Test Loss: 0.16176125407218933\n", "Epoch 8914/10000, Training Loss: 0.01713363267481327, Test Loss: 0.1617693305015564\n", "Epoch 8915/10000, Training Loss: 0.01712978258728981, Test Loss: 0.16180013120174408\n", "Epoch 8916/10000, Training Loss: 0.017125874757766724, Test Loss: 0.1618349552154541\n", "Epoch 8917/10000, Training Loss: 0.017122020944952965, Test Loss: 0.16184212267398834\n", "Epoch 8918/10000, Training Loss: 0.017118120566010475, Test Loss: 0.16189618408679962\n", "Epoch 8919/10000, Training Loss: 0.01711426116526127, Test Loss: 0.16188876330852509\n", "Epoch 8920/10000, Training Loss: 0.017110398039221764, Test Loss: 0.16195321083068848\n", "Epoch 8921/10000, Training Loss: 0.017106518149375916, Test Loss: 0.16194096207618713\n", "Epoch 8922/10000, Training Loss: 0.017102684825658798, Test Loss: 0.16200460493564606\n", "Epoch 8923/10000, Training Loss: 0.01709876023232937, Test Loss: 0.16199740767478943\n", "Epoch 8924/10000, Training Loss: 0.01709485426545143, Test Loss: 0.16205285489559174\n", "Epoch 8925/10000, Training Loss: 0.01709100231528282, Test Loss: 0.16205644607543945\n", "Epoch 8926/10000, Training Loss: 0.017087120562791824, Test Loss: 0.16209998726844788\n", "Epoch 8927/10000, Training Loss: 0.01708332635462284, Test Loss: 0.16211611032485962\n", "Epoch 8928/10000, Training Loss: 0.017079424113035202, Test Loss: 0.16214695572853088\n", "Epoch 8929/10000, Training Loss: 0.0170755535364151, Test Loss: 0.16217540204524994\n", "Epoch 8930/10000, Training Loss: 0.017071649432182312, Test Loss: 0.16219477355480194\n", "Epoch 8931/10000, Training Loss: 0.017067721113562584, Test Loss: 0.162233367562294\n", "Epoch 8932/10000, Training Loss: 0.017063915729522705, Test Loss: 0.16224437952041626\n", "Epoch 8933/10000, Training Loss: 0.017059992998838425, Test Loss: 0.1622897833585739\n", "Epoch 8934/10000, Training Loss: 0.017056085169315338, Test Loss: 0.1622953861951828\n", "Epoch 8935/10000, Training Loss: 0.017052272334694862, Test Loss: 0.16234555840492249\n", "Epoch 8936/10000, Training Loss: 0.017048362642526627, Test Loss: 0.16234682500362396\n", "Epoch 8937/10000, Training Loss: 0.01704449951648712, Test Loss: 0.16240105032920837\n", "Epoch 8938/10000, Training Loss: 0.01704062707722187, Test Loss: 0.1623990684747696\n", "Epoch 8939/10000, Training Loss: 0.017036747187376022, Test Loss: 0.16245535016059875\n", "Epoch 8940/10000, Training Loss: 0.017032815143465996, Test Loss: 0.16245198249816895\n", "Epoch 8941/10000, Training Loss: 0.017028959468007088, Test Loss: 0.16250981390476227\n", "Epoch 8942/10000, Training Loss: 0.017025051638484, Test Loss: 0.16250459849834442\n", "Epoch 8943/10000, Training Loss: 0.017021190375089645, Test Loss: 0.16256509721279144\n", "Epoch 8944/10000, Training Loss: 0.017017260193824768, Test Loss: 0.1625565141439438\n", "Epoch 8945/10000, Training Loss: 0.017013434320688248, Test Loss: 0.16262134909629822\n", "Epoch 8946/10000, Training Loss: 0.017009524628520012, Test Loss: 0.16260704398155212\n", "Epoch 8947/10000, Training Loss: 0.017005635425448418, Test Loss: 0.16267919540405273\n", "Epoch 8948/10000, Training Loss: 0.017001716420054436, Test Loss: 0.16265606880187988\n", "Epoch 8949/10000, Training 
Loss: 0.016997862607240677, Test Loss: 0.16273881494998932\n", "Epoch 8950/10000, Training Loss: 0.016994008794426918, Test Loss: 0.16270311176776886\n", "Epoch 8951/10000, Training Loss: 0.01699010096490383, Test Loss: 0.16280131042003632\n", "Epoch 8952/10000, Training Loss: 0.016986224800348282, Test Loss: 0.16274704039096832\n", "Epoch 8953/10000, Training Loss: 0.01698235236108303, Test Loss: 0.1628677397966385\n", "Epoch 8954/10000, Training Loss: 0.0169784314930439, Test Loss: 0.16278669238090515\n", "Epoch 8955/10000, Training Loss: 0.016974544152617455, Test Loss: 0.16293801367282867\n", "Epoch 8956/10000, Training Loss: 0.016970625147223473, Test Loss: 0.16282221674919128\n", "Epoch 8957/10000, Training Loss: 0.01696675829589367, Test Loss: 0.16301393508911133\n", "Epoch 8958/10000, Training Loss: 0.01696290820837021, Test Loss: 0.1628504991531372\n", "Epoch 8959/10000, Training Loss: 0.016959043219685555, Test Loss: 0.16309957206249237\n", "Epoch 8960/10000, Training Loss: 0.016955159604549408, Test Loss: 0.1628662347793579\n", "Epoch 8961/10000, Training Loss: 0.016951290890574455, Test Loss: 0.1632014960050583\n", "Epoch 8962/10000, Training Loss: 0.016947414726018906, Test Loss: 0.1628616899251938\n", "Epoch 8963/10000, Training Loss: 0.01694357395172119, Test Loss: 0.16332782804965973\n", "Epoch 8964/10000, Training Loss: 0.016939755529165268, Test Loss: 0.16282692551612854\n", "Epoch 8965/10000, Training Loss: 0.01693602465093136, Test Loss: 0.16349062323570251\n", "Epoch 8966/10000, Training Loss: 0.016932357102632523, Test Loss: 0.16274724900722504\n", "Epoch 8967/10000, Training Loss: 0.016928713768720627, Test Loss: 0.16370683908462524\n", "Epoch 8968/10000, Training Loss: 0.01692531444132328, Test Loss: 0.1626032143831253\n", "Epoch 8969/10000, Training Loss: 0.01692214608192444, Test Loss: 0.16399560868740082\n", "Epoch 8970/10000, Training Loss: 0.016919173300266266, Test Loss: 0.16237817704677582\n", "Epoch 8971/10000, Training Loss: 0.01691669225692749, Test Loss: 0.1643557995557785\n", "Epoch 8972/10000, Training Loss: 0.016914471983909607, Test Loss: 0.16210243105888367\n", "Epoch 8973/10000, Training Loss: 0.016912804916501045, Test Loss: 0.1647057682275772\n", "Epoch 8974/10000, Training Loss: 0.016910703852772713, Test Loss: 0.16193333268165588\n", "Epoch 8975/10000, Training Loss: 0.016908392310142517, Test Loss: 0.16480973362922668\n", "Epoch 8976/10000, Training Loss: 0.016903690993785858, Test Loss: 0.16217544674873352\n", "Epoch 8977/10000, Training Loss: 0.01689782552421093, Test Loss: 0.16440549492835999\n", "Epoch 8978/10000, Training Loss: 0.01689012162387371, Test Loss: 0.16294655203819275\n", "Epoch 8979/10000, Training Loss: 0.01688295044004917, Test Loss: 0.163608580827713\n", "Epoch 8980/10000, Training Loss: 0.01687758043408394, Test Loss: 0.16386426985263824\n", "Epoch 8981/10000, Training Loss: 0.016874274238944054, Test Loss: 0.16291046142578125\n", "Epoch 8982/10000, Training Loss: 0.016872214153409004, Test Loss: 0.16443490982055664\n", "Epoch 8983/10000, Training Loss: 0.016869958490133286, Test Loss: 0.16271796822547913\n", "Epoch 8984/10000, Training Loss: 0.016866683959960938, Test Loss: 0.16442006826400757\n", "Epoch 8985/10000, Training Loss: 0.016861725598573685, Test Loss: 0.1631193310022354\n", "Epoch 8986/10000, Training Loss: 0.016856180503964424, Test Loss: 0.16392740607261658\n", "Epoch 8987/10000, Training Loss: 0.016851061955094337, Test Loss: 0.16380684077739716\n", "Epoch 8988/10000, Training Loss: 0.01684696599841118, Test 
Loss: 0.16337908804416656\n", "Epoch 8989/10000, Training Loss: 0.016843833029270172, Test Loss: 0.1643131971359253\n", "Epoch 8990/10000, Training Loss: 0.01684095896780491, Test Loss: 0.16317631304264069\n", "Epoch 8991/10000, Training Loss: 0.016837527975440025, Test Loss: 0.16438187658786774\n", "Epoch 8992/10000, Training Loss: 0.016833437606692314, Test Loss: 0.16342632472515106\n", "Epoch 8993/10000, Training Loss: 0.016828764230012894, Test Loss: 0.16407573223114014\n", "Epoch 8994/10000, Training Loss: 0.016824275255203247, Test Loss: 0.1639169603586197\n", "Epoch 8995/10000, Training Loss: 0.01682017557322979, Test Loss: 0.16369661688804626\n", "Epoch 8996/10000, Training Loss: 0.016816643998026848, Test Loss: 0.16430915892124176\n", "Epoch 8997/10000, Training Loss: 0.016813309863209724, Test Loss: 0.16354262828826904\n", "Epoch 8998/10000, Training Loss: 0.01680976152420044, Test Loss: 0.16440144181251526\n", "Epoch 8999/10000, Training Loss: 0.016805846244096756, Test Loss: 0.1637052595615387\n", "Epoch 9000/10000, Training Loss: 0.016801659017801285, Test Loss: 0.16422399878501892\n", "Epoch 9001/10000, Training Loss: 0.016797497868537903, Test Loss: 0.16405066847801208\n", "Epoch 9002/10000, Training Loss: 0.016793537884950638, Test Loss: 0.1639750450849533\n", "Epoch 9003/10000, Training Loss: 0.016789793968200684, Test Loss: 0.16435296833515167\n", "Epoch 9004/10000, Training Loss: 0.0167861245572567, Test Loss: 0.16385839879512787\n", "Epoch 9005/10000, Training Loss: 0.016782505437731743, Test Loss: 0.1644660383462906\n", "Epoch 9006/10000, Training Loss: 0.016778726130723953, Test Loss: 0.16395093500614166\n", "Epoch 9007/10000, Training Loss: 0.016774805262684822, Test Loss: 0.16439011693000793\n", "Epoch 9008/10000, Training Loss: 0.01677076146006584, Test Loss: 0.16418462991714478\n", "Epoch 9009/10000, Training Loss: 0.016766805201768875, Test Loss: 0.16423937678337097\n", "Epoch 9010/10000, Training Loss: 0.016762932762503624, Test Loss: 0.16442200541496277\n", "Epoch 9011/10000, Training Loss: 0.016759216785430908, Test Loss: 0.1641465723514557\n", "Epoch 9012/10000, Training Loss: 0.016755517572164536, Test Loss: 0.1645551174879074\n", "Epoch 9013/10000, Training Loss: 0.016751781105995178, Test Loss: 0.16418305039405823\n", "Epoch 9014/10000, Training Loss: 0.016747860237956047, Test Loss: 0.1645580232143402\n", "Epoch 9015/10000, Training Loss: 0.016743987798690796, Test Loss: 0.16433049738407135\n", "Epoch 9016/10000, Training Loss: 0.01674010045826435, Test Loss: 0.1644858419895172\n", "Epoch 9017/10000, Training Loss: 0.016736216843128204, Test Loss: 0.16451174020767212\n", "Epoch 9018/10000, Training Loss: 0.016732368618249893, Test Loss: 0.16441957652568817\n", "Epoch 9019/10000, Training Loss: 0.01672864519059658, Test Loss: 0.16465260088443756\n", "Epoch 9020/10000, Training Loss: 0.016724804416298866, Test Loss: 0.16441820561885834\n", "Epoch 9021/10000, Training Loss: 0.016721077263355255, Test Loss: 0.16471609473228455\n", "Epoch 9022/10000, Training Loss: 0.01671721413731575, Test Loss: 0.16449540853500366\n", "Epoch 9023/10000, Training Loss: 0.016713343560695648, Test Loss: 0.16471220552921295\n", "Epoch 9024/10000, Training Loss: 0.01670948974788189, Test Loss: 0.1646215319633484\n", "Epoch 9025/10000, Training Loss: 0.01670563407242298, Test Loss: 0.16468177735805511\n", "Epoch 9026/10000, Training Loss: 0.016701780259609222, Test Loss: 0.16475161910057068\n", "Epoch 9027/10000, Training Loss: 0.016697969287633896, Test Loss: 
0.16466784477233887\n", "Epoch 9028/10000, Training Loss: 0.016694162040948868, Test Loss: 0.164849653840065\n", "Epoch 9029/10000, Training Loss: 0.016690324991941452, Test Loss: 0.16469578444957733\n", "Epoch 9030/10000, Training Loss: 0.016686566174030304, Test Loss: 0.16490182280540466\n", "Epoch 9031/10000, Training Loss: 0.016682712361216545, Test Loss: 0.16476784646511078\n", "Epoch 9032/10000, Training Loss: 0.016678860411047935, Test Loss: 0.16491782665252686\n", "Epoch 9033/10000, Training Loss: 0.01667500101029873, Test Loss: 0.16486544907093048\n", "Epoch 9034/10000, Training Loss: 0.01667114347219467, Test Loss: 0.16492095589637756\n", "Epoch 9035/10000, Training Loss: 0.01666731759905815, Test Loss: 0.1649642437696457\n", "Epoch 9036/10000, Training Loss: 0.01666346937417984, Test Loss: 0.16493244469165802\n", "Epoch 9037/10000, Training Loss: 0.016659699380397797, Test Loss: 0.16504736244678497\n", "Epoch 9038/10000, Training Loss: 0.01665583625435829, Test Loss: 0.1649654507637024\n", "Epoch 9039/10000, Training Loss: 0.016652006655931473, Test Loss: 0.16510668396949768\n", "Epoch 9040/10000, Training Loss: 0.0166481900960207, Test Loss: 0.1650216430425644\n", "Epoch 9041/10000, Training Loss: 0.016644330695271492, Test Loss: 0.16514597833156586\n", "Epoch 9042/10000, Training Loss: 0.01664053276181221, Test Loss: 0.16509398818016052\n", "Epoch 9043/10000, Training Loss: 0.016636580228805542, Test Loss: 0.16517373919487\n", "Epoch 9044/10000, Training Loss: 0.016632838174700737, Test Loss: 0.16517287492752075\n", "Epoch 9045/10000, Training Loss: 0.016628950834274292, Test Loss: 0.16520008444786072\n", "Epoch 9046/10000, Training Loss: 0.016625121235847473, Test Loss: 0.1652495414018631\n", "Epoch 9047/10000, Training Loss: 0.016621263697743416, Test Loss: 0.16523295640945435\n", "Epoch 9048/10000, Training Loss: 0.01661744900047779, Test Loss: 0.1653166264295578\n", "Epoch 9049/10000, Training Loss: 0.016613570973277092, Test Loss: 0.16527676582336426\n", "Epoch 9050/10000, Training Loss: 0.016609737649559975, Test Loss: 0.16537326574325562\n", "Epoch 9051/10000, Training Loss: 0.016605859622359276, Test Loss: 0.16532976925373077\n", "Epoch 9052/10000, Training Loss: 0.01660209707915783, Test Loss: 0.1654227375984192\n", "Epoch 9053/10000, Training Loss: 0.016598232090473175, Test Loss: 0.1653890162706375\n", "Epoch 9054/10000, Training Loss: 0.016594359651207924, Test Loss: 0.16546635329723358\n", "Epoch 9055/10000, Training Loss: 0.01659051701426506, Test Loss: 0.16545309126377106\n", "Epoch 9056/10000, Training Loss: 0.016586685553193092, Test Loss: 0.1655074805021286\n", "Epoch 9057/10000, Training Loss: 0.01658286713063717, Test Loss: 0.16551801562309265\n", "Epoch 9058/10000, Training Loss: 0.016579030081629753, Test Loss: 0.1655491441488266\n", "Epoch 9059/10000, Training Loss: 0.01657515950500965, Test Loss: 0.16558226943016052\n", "Epoch 9060/10000, Training Loss: 0.016571279615163803, Test Loss: 0.1655920147895813\n", "Epoch 9061/10000, Training Loss: 0.016567451879382133, Test Loss: 0.1656450629234314\n", "Epoch 9062/10000, Training Loss: 0.016563571989536285, Test Loss: 0.16563743352890015\n", "Epoch 9063/10000, Training Loss: 0.016559720039367676, Test Loss: 0.1657046526670456\n", "Epoch 9064/10000, Training Loss: 0.016555888578295708, Test Loss: 0.1656864434480667\n", "Epoch 9065/10000, Training Loss: 0.016552019864320755, Test Loss: 0.1657603681087494\n", "Epoch 9066/10000, Training Loss: 0.01654822751879692, Test Loss: 0.16573937237262726\n", "Epoch 
9067/10000, Training Loss: 0.016544343903660774, Test Loss: 0.16581317782402039\n", "Epoch 9068/10000, Training Loss: 0.01654047891497612, Test Loss: 0.16579441726207733\n", "Epoch 9069/10000, Training Loss: 0.016536589711904526, Test Loss: 0.16586509346961975\n", "Epoch 9070/10000, Training Loss: 0.0165327787399292, Test Loss: 0.16585016250610352\n", "Epoch 9071/10000, Training Loss: 0.016528908163309097, Test Loss: 0.16591645777225494\n", "Epoch 9072/10000, Training Loss: 0.01652510091662407, Test Loss: 0.16590701043605804\n", "Epoch 9073/10000, Training Loss: 0.016521194949746132, Test Loss: 0.1659674048423767\n", "Epoch 9074/10000, Training Loss: 0.016517329961061478, Test Loss: 0.16596351563930511\n", "Epoch 9075/10000, Training Loss: 0.016513509675860405, Test Loss: 0.1660190373659134\n", "Epoch 9076/10000, Training Loss: 0.016509635373950005, Test Loss: 0.16602011024951935\n", "Epoch 9077/10000, Training Loss: 0.016505790874361992, Test Loss: 0.1660703867673874\n", "Epoch 9078/10000, Training Loss: 0.01650194078683853, Test Loss: 0.16607636213302612\n", "Epoch 9079/10000, Training Loss: 0.016498029232025146, Test Loss: 0.16612273454666138\n", "Epoch 9080/10000, Training Loss: 0.01649419777095318, Test Loss: 0.16613264381885529\n", "Epoch 9081/10000, Training Loss: 0.016490334644913673, Test Loss: 0.1661745011806488\n", "Epoch 9082/10000, Training Loss: 0.016486452892422676, Test Loss: 0.16618919372558594\n", "Epoch 9083/10000, Training Loss: 0.01648261584341526, Test Loss: 0.16622668504714966\n", "Epoch 9084/10000, Training Loss: 0.0164787657558918, Test Loss: 0.16624543070793152\n", "Epoch 9085/10000, Training Loss: 0.01647491194307804, Test Loss: 0.16628029942512512\n", "Epoch 9086/10000, Training Loss: 0.016470981761813164, Test Loss: 0.16629977524280548\n", "Epoch 9087/10000, Training Loss: 0.01646711304783821, Test Loss: 0.1663358360528946\n", "Epoch 9088/10000, Training Loss: 0.016463279724121094, Test Loss: 0.16635292768478394\n", "Epoch 9089/10000, Training Loss: 0.016459394246339798, Test Loss: 0.1663927137851715\n", "Epoch 9090/10000, Training Loss: 0.016455497592687607, Test Loss: 0.16640476882457733\n", "Epoch 9091/10000, Training Loss: 0.01645171083509922, Test Loss: 0.16645072400569916\n", "Epoch 9092/10000, Training Loss: 0.016447776928544044, Test Loss: 0.1664557307958603\n", "Epoch 9093/10000, Training Loss: 0.01644391007721424, Test Loss: 0.16650983691215515\n", "Epoch 9094/10000, Training Loss: 0.01644010841846466, Test Loss: 0.16650567948818207\n", "Epoch 9095/10000, Training Loss: 0.016436230391263962, Test Loss: 0.16657043993473053\n", "Epoch 9096/10000, Training Loss: 0.016432326287031174, Test Loss: 0.1665540188550949\n", "Epoch 9097/10000, Training Loss: 0.016428466886281967, Test Loss: 0.16663408279418945\n", "Epoch 9098/10000, Training Loss: 0.016424553468823433, Test Loss: 0.16659782826900482\n", "Epoch 9099/10000, Training Loss: 0.016420729458332062, Test Loss: 0.16670335829257965\n", "Epoch 9100/10000, Training Loss: 0.016416853293776512, Test Loss: 0.16663572192192078\n", "Epoch 9101/10000, Training Loss: 0.016412939876317978, Test Loss: 0.16678006947040558\n", "Epoch 9102/10000, Training Loss: 0.0164091307669878, Test Loss: 0.16666357219219208\n", "Epoch 9103/10000, Training Loss: 0.016405291855335236, Test Loss: 0.16687023639678955\n", "Epoch 9104/10000, Training Loss: 0.01640138030052185, Test Loss: 0.16667351126670837\n", "Epoch 9105/10000, Training Loss: 0.016397524625062943, Test Loss: 0.16698355972766876\n", "Epoch 9106/10000, Training Loss: 
0.01639370620250702, Test Loss: 0.16665437817573547\n", "Epoch 9107/10000, Training Loss: 0.016389939934015274, Test Loss: 0.16713455319404602\n", "Epoch 9108/10000, Training Loss: 0.016386162489652634, Test Loss: 0.16658513247966766\n", "Epoch 9109/10000, Training Loss: 0.016382528468966484, Test Loss: 0.1673508733510971\n", "Epoch 9110/10000, Training Loss: 0.01637898199260235, Test Loss: 0.16643080115318298\n", "Epoch 9111/10000, Training Loss: 0.01637568697333336, Test Loss: 0.16767312586307526\n", "Epoch 9112/10000, Training Loss: 0.016372771933674812, Test Loss: 0.16614198684692383\n", "Epoch 9113/10000, Training Loss: 0.016370512545108795, Test Loss: 0.16814720630645752\n", "Epoch 9114/10000, Training Loss: 0.016369007527828217, Test Loss: 0.16568510234355927\n", "Epoch 9115/10000, Training Loss: 0.016368936747312546, Test Loss: 0.16874705255031586\n", "Epoch 9116/10000, Training Loss: 0.016369204968214035, Test Loss: 0.16520044207572937\n", "Epoch 9117/10000, Training Loss: 0.0163706187158823, Test Loss: 0.16913609206676483\n", "Epoch 9118/10000, Training Loss: 0.016368018463253975, Test Loss: 0.1652834564447403\n", "Epoch 9119/10000, Training Loss: 0.016362331807613373, Test Loss: 0.16862916946411133\n", "Epoch 9120/10000, Training Loss: 0.01635025255382061, Test Loss: 0.1664796769618988\n", "Epoch 9121/10000, Training Loss: 0.016338804736733437, Test Loss: 0.16723540425300598\n", "Epoch 9122/10000, Training Loss: 0.016332024708390236, Test Loss: 0.1680050790309906\n", "Epoch 9123/10000, Training Loss: 0.016331003978848457, Test Loss: 0.16611066460609436\n", "Epoch 9124/10000, Training Loss: 0.016332142055034637, Test Loss: 0.16865038871765137\n", "Epoch 9125/10000, Training Loss: 0.016329843550920486, Test Loss: 0.16623060405254364\n", "Epoch 9126/10000, Training Loss: 0.016323702409863472, Test Loss: 0.16801492869853973\n", "Epoch 9127/10000, Training Loss: 0.01631537824869156, Test Loss: 0.16739776730537415\n", "Epoch 9128/10000, Training Loss: 0.016309522092342377, Test Loss: 0.166890949010849\n", "Epoch 9129/10000, Training Loss: 0.016307340934872627, Test Loss: 0.16834942996501923\n", "Epoch 9130/10000, Training Loss: 0.01630617119371891, Test Loss: 0.16651973128318787\n", "Epoch 9131/10000, Training Loss: 0.016303133219480515, Test Loss: 0.16824302077293396\n", "Epoch 9132/10000, Training Loss: 0.016297368332743645, Test Loss: 0.16721868515014648\n", "Epoch 9133/10000, Training Loss: 0.016291359439492226, Test Loss: 0.16740699112415314\n", "Epoch 9134/10000, Training Loss: 0.016287192702293396, Test Loss: 0.16811662912368774\n", "Epoch 9135/10000, Training Loss: 0.01628480665385723, Test Loss: 0.16689053177833557\n", "Epoch 9136/10000, Training Loss: 0.01628231443464756, Test Loss: 0.16829648613929749\n", "Epoch 9137/10000, Training Loss: 0.016278104856610298, Test Loss: 0.16724134981632233\n", "Epoch 9138/10000, Training Loss: 0.016273008659482002, Test Loss: 0.16775308549404144\n", "Epoch 9139/10000, Training Loss: 0.016268378123641014, Test Loss: 0.1679779589176178\n", "Epoch 9140/10000, Training Loss: 0.016265006735920906, Test Loss: 0.16724301874637604\n", "Epoch 9141/10000, Training Loss: 0.016262151300907135, Test Loss: 0.1682981550693512\n", "Epoch 9142/10000, Training Loss: 0.016258640214800835, Test Loss: 0.16735972464084625\n", "Epoch 9143/10000, Training Loss: 0.016254369169473648, Test Loss: 0.167998805642128\n", "Epoch 9144/10000, Training Loss: 0.016249841079115868, Test Loss: 0.16791343688964844\n", "Epoch 9145/10000, Training Loss: 0.016245944425463676, 
Test Loss: 0.16756226122379303\n", "Epoch 9146/10000, Training Loss: 0.01624266989529133, Test Loss: 0.16828659176826477\n", "Epoch 9147/10000, Training Loss: 0.01623925380408764, Test Loss: 0.16752736270427704\n", "Epoch 9148/10000, Training Loss: 0.016235487535595894, Test Loss: 0.16818076372146606\n", "Epoch 9149/10000, Training Loss: 0.01623126119375229, Test Loss: 0.16790609061717987\n", "Epoch 9150/10000, Training Loss: 0.016227254644036293, Test Loss: 0.16784772276878357\n", "Epoch 9151/10000, Training Loss: 0.016223620623350143, Test Loss: 0.1682766228914261\n", "Epoch 9152/10000, Training Loss: 0.016220126301050186, Test Loss: 0.1677248626947403\n", "Epoch 9153/10000, Training Loss: 0.016216523945331573, Test Loss: 0.16831515729427338\n", "Epoch 9154/10000, Training Loss: 0.016212619841098785, Test Loss: 0.16794903576374054\n", "Epoch 9155/10000, Training Loss: 0.01620861515402794, Test Loss: 0.16809812188148499\n", "Epoch 9156/10000, Training Loss: 0.016204873099923134, Test Loss: 0.16827630996704102\n", "Epoch 9157/10000, Training Loss: 0.01620117761194706, Test Loss: 0.16794097423553467\n", "Epoch 9158/10000, Training Loss: 0.01619758829474449, Test Loss: 0.16841243207454681\n", "Epoch 9159/10000, Training Loss: 0.016193868592381477, Test Loss: 0.16803781688213348\n", "Epoch 9160/10000, Training Loss: 0.016189998015761375, Test Loss: 0.16831088066101074\n", "Epoch 9161/10000, Training Loss: 0.016186140477657318, Test Loss: 0.1682939976453781\n", "Epoch 9162/10000, Training Loss: 0.016182370483875275, Test Loss: 0.16816498339176178\n", "Epoch 9163/10000, Training Loss: 0.0161786749958992, Test Loss: 0.1684797704219818\n", "Epoch 9164/10000, Training Loss: 0.016174979507923126, Test Loss: 0.168170765042305\n", "Epoch 9165/10000, Training Loss: 0.016171282157301903, Test Loss: 0.16848339140415192\n", "Epoch 9166/10000, Training Loss: 0.01616745814681053, Test Loss: 0.1683391034603119\n", "Epoch 9167/10000, Training Loss: 0.01616363413631916, Test Loss: 0.1683846116065979\n", "Epoch 9168/10000, Training Loss: 0.016159888356924057, Test Loss: 0.16852989792823792\n", "Epoch 9169/10000, Training Loss: 0.0161561518907547, Test Loss: 0.16833972930908203\n", "Epoch 9170/10000, Training Loss: 0.016152434051036835, Test Loss: 0.16861280798912048\n", "Epoch 9171/10000, Training Loss: 0.016148675233125687, Test Loss: 0.16842462122440338\n", "Epoch 9172/10000, Training Loss: 0.01614491082727909, Test Loss: 0.16858047246932983\n", "Epoch 9173/10000, Training Loss: 0.016141152009367943, Test Loss: 0.16858351230621338\n", "Epoch 9174/10000, Training Loss: 0.01613735780119896, Test Loss: 0.16852912306785583\n", "Epoch 9175/10000, Training Loss: 0.016133686527609825, Test Loss: 0.16870811581611633\n", "Epoch 9176/10000, Training Loss: 0.01612991653382778, Test Loss: 0.16855086386203766\n", "Epoch 9177/10000, Training Loss: 0.01612613908946514, Test Loss: 0.16874352097511292\n", "Epoch 9178/10000, Training Loss: 0.016122357919812202, Test Loss: 0.168655663728714\n", "Epoch 9179/10000, Training Loss: 0.016118625178933144, Test Loss: 0.16872073709964752\n", "Epoch 9180/10000, Training Loss: 0.016114838421344757, Test Loss: 0.16878296434879303\n", "Epoch 9181/10000, Training Loss: 0.016111090779304504, Test Loss: 0.16871336102485657\n", "Epoch 9182/10000, Training Loss: 0.016107358038425446, Test Loss: 0.1688663363456726\n", "Epoch 9183/10000, Training Loss: 0.01610364206135273, Test Loss: 0.16876527667045593\n", "Epoch 9184/10000, Training Loss: 0.016099838539958, Test Loss: 0.1688900589942932\n", 
"Epoch 9185/10000, Training Loss: 0.01609610579907894, Test Loss: 0.16886469721794128\n", "Epoch 9186/10000, Training Loss: 0.0160923320800066, Test Loss: 0.16888824105262756\n", "Epoch 9187/10000, Training Loss: 0.016088565811514854, Test Loss: 0.1689644455909729\n", "Epoch 9188/10000, Training Loss: 0.01608484797179699, Test Loss: 0.16890810430049896\n", "Epoch 9189/10000, Training Loss: 0.016081031411886215, Test Loss: 0.16902856528759003\n", "Epoch 9190/10000, Training Loss: 0.016077276319265366, Test Loss: 0.1689695566892624\n", "Epoch 9191/10000, Training Loss: 0.01607353612780571, Test Loss: 0.1690545231103897\n", "Epoch 9192/10000, Training Loss: 0.01606973446905613, Test Loss: 0.16905777156352997\n", "Epoch 9193/10000, Training Loss: 0.016065988689661026, Test Loss: 0.16906991600990295\n", "Epoch 9194/10000, Training Loss: 0.016062207520008087, Test Loss: 0.16914121806621552\n", "Epoch 9195/10000, Training Loss: 0.01605846732854843, Test Loss: 0.16910281777381897\n", "Epoch 9196/10000, Training Loss: 0.016054704785346985, Test Loss: 0.1691998690366745\n", "Epoch 9197/10000, Training Loss: 0.01605093479156494, Test Loss: 0.16916337609291077\n", "Epoch 9198/10000, Training Loss: 0.01604713313281536, Test Loss: 0.16923397779464722\n", "Epoch 9199/10000, Training Loss: 0.016043376177549362, Test Loss: 0.16924089193344116\n", "Epoch 9200/10000, Training Loss: 0.016039632260799408, Test Loss: 0.1692611128091812\n", "Epoch 9201/10000, Training Loss: 0.016035834327340126, Test Loss: 0.16931487619876862\n", "Epoch 9202/10000, Training Loss: 0.01603206992149353, Test Loss: 0.16930080950260162\n", "Epoch 9203/10000, Training Loss: 0.016028333455324173, Test Loss: 0.16937309503555298\n", "Epoch 9204/10000, Training Loss: 0.016024556010961533, Test Loss: 0.16935688257217407\n", "Epoch 9205/10000, Training Loss: 0.016020746901631355, Test Loss: 0.16941586136817932\n", "Epoch 9206/10000, Training Loss: 0.01601700857281685, Test Loss: 0.16942551732063293\n", "Epoch 9207/10000, Training Loss: 0.01601329818367958, Test Loss: 0.16945210099220276\n", "Epoch 9208/10000, Training Loss: 0.016009444370865822, Test Loss: 0.16949477791786194\n", "Epoch 9209/10000, Training Loss: 0.016005748882889748, Test Loss: 0.16949336230754852\n", "Epoch 9210/10000, Training Loss: 0.016001908108592033, Test Loss: 0.16955456137657166\n", "Epoch 9211/10000, Training Loss: 0.01599813811480999, Test Loss: 0.16954638063907623\n", "Epoch 9212/10000, Training Loss: 0.0159943588078022, Test Loss: 0.16960303485393524\n", "Epoch 9213/10000, Training Loss: 0.015990637242794037, Test Loss: 0.16960984468460083\n", "Epoch 9214/10000, Training Loss: 0.015986869111657143, Test Loss: 0.1696447879076004\n", "Epoch 9215/10000, Training Loss: 0.01598302274942398, Test Loss: 0.16967560350894928\n", "Epoch 9216/10000, Training Loss: 0.01597927324473858, Test Loss: 0.16968825459480286\n", "Epoch 9217/10000, Training Loss: 0.015975506976246834, Test Loss: 0.16973651945590973\n", "Epoch 9218/10000, Training Loss: 0.015971751883625984, Test Loss: 0.16973936557769775\n", "Epoch 9219/10000, Training Loss: 0.015967948362231255, Test Loss: 0.16968809068202972\n", "Epoch 9220/10000, Training Loss: 0.01596427708864212, Test Loss: 0.17005865275859833\n", "Epoch 9221/10000, Training Loss: 0.01596064493060112, Test Loss: 0.16962161660194397\n", "Epoch 9222/10000, Training Loss: 0.015956982970237732, Test Loss: 0.17022700607776642\n", "Epoch 9223/10000, Training Loss: 0.015953274443745613, Test Loss: 0.16970592737197876\n", "Epoch 9224/10000, Training 
Loss: 0.015949474647641182, Test Loss: 0.1702396273612976\n", "Epoch 9225/10000, Training Loss: 0.01594557799398899, Test Loss: 0.16990871727466583\n", "Epoch 9226/10000, Training Loss: 0.015941647812724113, Test Loss: 0.17016059160232544\n", "Epoch 9227/10000, Training Loss: 0.01593775860965252, Test Loss: 0.17014969885349274\n", "Epoch 9228/10000, Training Loss: 0.01593392714858055, Test Loss: 0.170077383518219\n", "Epoch 9229/10000, Training Loss: 0.01593022421002388, Test Loss: 0.17034722864627838\n", "Epoch 9230/10000, Training Loss: 0.015926498919725418, Test Loss: 0.1700579822063446\n", "Epoch 9231/10000, Training Loss: 0.015922730788588524, Test Loss: 0.17045782506465912\n", "Epoch 9232/10000, Training Loss: 0.015919018536806107, Test Loss: 0.17010502517223358\n", "Epoch 9233/10000, Training Loss: 0.01591525226831436, Test Loss: 0.17053169012069702\n", "Epoch 9234/10000, Training Loss: 0.015911424532532692, Test Loss: 0.17019639909267426\n", "Epoch 9235/10000, Training Loss: 0.01590763032436371, Test Loss: 0.1705583781003952\n", "Epoch 9236/10000, Training Loss: 0.015903813764452934, Test Loss: 0.17036382853984833\n", "Epoch 9237/10000, Training Loss: 0.015899956226348877, Test Loss: 0.17055225372314453\n", "Epoch 9238/10000, Training Loss: 0.015896130353212357, Test Loss: 0.17056013643741608\n", "Epoch 9239/10000, Training Loss: 0.015892349183559418, Test Loss: 0.17055708169937134\n", "Epoch 9240/10000, Training Loss: 0.015888579189777374, Test Loss: 0.17064258456230164\n", "Epoch 9241/10000, Training Loss: 0.015884816646575928, Test Loss: 0.1707664132118225\n", "Epoch 9242/10000, Training Loss: 0.015881028026342392, Test Loss: 0.17091882228851318\n", "Epoch 9243/10000, Training Loss: 0.015877263620495796, Test Loss: 0.17087429761886597\n", "Epoch 9244/10000, Training Loss: 0.015873514115810394, Test Loss: 0.17115198075771332\n", "Epoch 9245/10000, Training Loss: 0.015869738534092903, Test Loss: 0.17101162672042847\n", "Epoch 9246/10000, Training Loss: 0.015865957364439964, Test Loss: 0.17135311663150787\n", "Epoch 9247/10000, Training Loss: 0.015862176194787025, Test Loss: 0.17113232612609863\n", "Epoch 9248/10000, Training Loss: 0.015858476981520653, Test Loss: 0.17153626680374146\n", "Epoch 9249/10000, Training Loss: 0.01585465669631958, Test Loss: 0.1712469458580017\n", "Epoch 9250/10000, Training Loss: 0.015850910916924477, Test Loss: 0.1716979593038559\n", "Epoch 9251/10000, Training Loss: 0.01584714464843273, Test Loss: 0.17136169970035553\n", "Epoch 9252/10000, Training Loss: 0.015843387693166733, Test Loss: 0.17183852195739746\n", "Epoch 9253/10000, Training Loss: 0.015839572995901108, Test Loss: 0.1714961677789688\n", "Epoch 9254/10000, Training Loss: 0.015835754573345184, Test Loss: 0.17191360890865326\n", "Epoch 9255/10000, Training Loss: 0.015831928700208664, Test Loss: 0.17165637016296387\n", "Epoch 9256/10000, Training Loss: 0.015828097239136696, Test Loss: 0.17194780707359314\n", "Epoch 9257/10000, Training Loss: 0.015824295580387115, Test Loss: 0.1718231439590454\n", "Epoch 9258/10000, Training Loss: 0.01582048088312149, Test Loss: 0.17196279764175415\n", "Epoch 9259/10000, Training Loss: 0.015816662460565567, Test Loss: 0.17197951674461365\n", "Epoch 9260/10000, Training Loss: 0.015812860801815987, Test Loss: 0.17197787761688232\n", "Epoch 9261/10000, Training Loss: 0.01580907218158245, Test Loss: 0.17211389541625977\n", "Epoch 9262/10000, Training Loss: 0.01580524817109108, Test Loss: 0.17200417816638947\n", "Epoch 9263/10000, Training Loss: 
0.015801532194018364, Test Loss: 0.17222166061401367\n", "...\n", "Epoch 9500/10000, Training Loss: 0.014915453270077705, Test Loss: 0.18525369465351105\n", "...\n", "Epoch 10000/10000, Training Loss: 0.013108654879033566, Test Loss: 0.2092178910970688\n" ] }, { "data": { "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAjcAAAHHCAYAAABDUnkqAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjguMCwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy81sbWrAAAACXBIWXMAAA9hAAAPYQGoP6dpAABtHElEQVR4nO3dd3gU1f4G8Hf7bnqvBELvQgAJoQhoMBS5giiIXAk2LghYEBVUmi02FBUEsYAFBOGHWOhEuF4QpSNKr6Gkh/S6u+f3xyRDloQQkk0m2byf59lnd2fO7H53ErIvZ86cUQkhBIiIiIgchFrpAoiIiIjsieGGiIiIHArDDRERETkUhhsiIiJyKAw3RERE5FAYboiIiMihMNwQERGRQ2G4ISIiIofCcENEREQOheGG6r1x48YhNDS0StvOmTMHKpXKvgXVMefPn4dKpcKyZcuULuWmli1bBpVKhfPnzytdCjUgJb93+/btU7oUshOGG6oxKpWqUrcdO3YoXWqDFxoaWqmflb0C0ptvvol169bZ5bXspSTopqSkKF1Kpfzyyy8YOHAgvL29YTQa0apVK0ybNg2pqalKl1ZGSXi40e2PP/5QukRyMFqlCyDH9c0339g8//rrr7F169Yyy9u2bVut9/nss89gtVqrtO0rr7yC6dOnV+v9HcH8+fORnZ0tP9+wYQO+++47fPDBB/Dx8ZGX9+zZ0y7v9+abb+L+++/HsGHDbJY//PDDePDBB2EwGOzyPo5q2rRpmDdvHjp16oQXX3wRXl5eOHDgABYsWICVK1ciNjYWrVu3VrrMMl599VU0bdq0zPIWLVooUA05MoYbqjH//ve/bZ7/8ccf2Lp1a5nl18vNzYWTk1Ol30en01WpPgDQarXQavnP4PqQkZCQgO+++w7Dhg2r8iG/qtBoNNBoNLX2fvXRd999h3nz5mHUqFFYvny5zf4aN24c+vfvjwceeAAHDhyo1d/tnJwcODs7V9hm0KBB6NatWy1VRA0ZD0uRovr164cOHTpg//79uOOOO+Dk5ISXXnoJAPDjjz9iyJAhCAoKgsFgQPPmzfHaa6/BYrHYvMb1Y25Kxpi89957WLJkCZo3bw6DwYDbb78de/futdm2vDE3KpUKkydPxrp169ChQwcYDAa0b98emzZtKlP/jh070K1bNxiNRjRv3hyffvpppcfx/O9//8MDDzyAxo0bw2AwICQkBM8++yzy8vLKfD4XFxdcvnwZw4YNg4uLC3x9fTFt2rQy+yI9PR3jxo2Du7s7PDw8EB0djfT09JvWUlnffvstunbtCpPJBC8vLzz44IO4ePGiTZtTp05hxIgRCAgIgNFoRKNGjfDggw8iIyMDgLR/c3Jy8NVXX8mHJcaNGweg/DE3oaGhuOeee7Bz5050794dRqMRzZo1w9dff12mvr/++gt9+/aFyWRCo0aN8Prrr2Pp0qV2Hcfz66+/ok+fPnB2doaHhwfuvfdeHDt2zKZNVlYWnnnmGYSGhsJgMMDPzw8DBgzAgQMHKr2fbmTu3Lnw9PTEkiVLygTB7t2748UXX8SRI0ewZs0aAMDkyZPh4uKC3NzcMq81evRoBAQE2Pwebdy4Uf58rq6uGDJkCP755x+b7Up+J8+cOYPBgwfD1dUVY8aMqdwOrEDpf7sffPABmjRpApPJhL59++Lvv/8u074yPwsAuHz5Mh577DH5b0nTpk0xceJEFBYW2rQrKCjA1KlT4evrC2dnZwwfPhzJyck2bfbt24eoqCj4+PjAZDKhadOmePTRR6v92cm++F9WUlxqaioGDRqEBx98EP/+97/h7+8PQPqic3FxwdSpU+Hi4oJff/0Vs2bNQmZmJt59992bvu6KFSuQlZWF//znP1CpVHjnnXdw33334ezZszft7dm5cyfWrl2LJ598Eq6urvjoo48wYsQIxMXFwdvbGwBw8OBBDBw4EIGBgZg7dy4sFgteffVV+Pr6Vupzr169Grm5uZg4cSK8vb2xZ88efPzxx7h06RJWr15t09ZisSAqKgrh4eF47733sG3bNsybNw/NmzfHxIkTAQBCCNx7773YuXMnJkyYgLZt2+KHH35AdHR0peq5mTfeeAMzZ87EyJEj8fjjjyM5ORkff/wx7rjjDhw8eBAeHh4oLCxEVFQUCgoKMGXKFAQEBODy5cv45ZdfkJ6eDnd3d3zzzTd4/PHH0b17d4wfPx4A0Lx58wrf+/Tp07j//vvx2GOPITo6Gl9++SXGjRuHrl27on379gCkL7D+/ftDpVJhxowZcHZ2xueff27XQ1zbtm3DoEGD0KxZM8yZMwd5eXn4+OOP0atXLxw4cEAO2RMmTMCaNWswefJktGvXDqmpqdi5cyeOHTuGLl26VGo/lefUqVM4ceIExo0bBzc3t3LbjB07FrNnz8Yvv/yCBx98EKNGjcLChQuxfv16PPDAA3K73Nxc/Pzzzxg3bpwckr755htER0cjKioKb7/9NnJzc7Fo0SL07t0bBw8etPlPhNlsRlRUFHr37o333nuvUr2tGRkZZcY0qVQq+d9Uia+//hpZWVmYNGkS8vPz8eGHH+LOO+/EkSNH5L8Plf1ZXLlyBd27d0d6ejrGjx+PNm3a4PLly1izZg1yc3Oh1+vl950yZQo8PT0xe/ZsnD9/HvPnz8fkyZOxatUqAEBSUhLuvvtu+Pr6Yvr06fDw8MD58+exdu3am352qmWCqJZMmjRJXP8r17dvXwFALF68uEz73NzcMsv+85//CCcnJ5Gfny8vi46OFk2aNJGfnzt3TgAQ3t7eIi0tTV7+448/CgDi559/lpfNnj27TE0AhF6vF6dPn5aXHT58WAAQH3/8sbxs6NChwsnJSVy+fFledurUKaHVasu8ZnnK+3wxMTFCpVKJCxcu2Hw+AOLVV1+1aRsWFia6du0qP1+3bp0AIN555x15mdlsFn369BEAxNKlS29aU4l3331XABDnzp0TQghx/vx5odFoxBtvvGHT7siRI0Kr1crLDx48KACI1atXV/j6zs7OIjo6uszypUuX2ryvEEI0adJEABC//fabvCwpKUkYDAbx3HPPycumTJkiVCqVOHjwoLwsNTVVeHl5lXnN8pT8LiQnJ9+wTefOnYWfn59ITU2Vlx0+fFio1WoxduxYeZm7u7uYNGnSDV+nsvvpeiU/4w8++KDCdm5ubqJLly5CCCGsVqsIDg4WI0aMsGnz/fff2+zXrKws4eHhIZ544gmbdgkJCcLd3d1mecnv5PTp0ytVd8nPtbybwWCQ25X82zWZTOLSpUvy8j///FMAEM8++6y8rLI/i7Fjxwq1Wi327t1bpi6r1WpTX2RkpLxMCCGeffZZodFoRHp6uhBCiB9++EEAKPe1qG7hYSlSnMFgwCOPPFJmuclkkh9nZWUhJSUFffr0QW5uLo4fP37T1x01ahQ8PT3l53369AEAnD179qbbRkZG2vQm3HbbbXBzc5O3tVgs2LZtG4YNG4agoCC5XYsWLTBo0KCbvj5g+/lycnKQkpKCnj17QgiBgwcPlmk/YcIEm+d9+vSx+S
wbNmyAVquVe3IAaQzLlClTKlVPRdauXQur1YqRI0ciJSVFvgUEBKBly5bYvn07AMg9Dps3by73MEhVtWvXTv75AYCvry9at25t8/k3bdqEiIgIdO7cWV7m5eVll8MlABAfH49Dhw5h3Lhx8PLykpffdtttGDBgADZs2CAv8/DwwJ9//okrV66U+1pV3U9ZWVkAAFdX1wrbubq6IjMzE4DUM/LAAw9gw4YNNoPGV61aheDgYPTu3RsAsHXrVqSnp2P06NE2P2ONRoPw8HD5Z1xa6d+1yli4cCG2bt1qc9u4cWOZdsOGDUNwcLD8vHv37ggPD5f3cWV/FlarFevWrcPQoUPLHetz/eHj8ePH2yzr06cPLBYLLly4AED6uQLSmWpFRUW39NmpdjHckOKCg4NtuoZL/PPPPxg+fDjc3d3h5uYGX19feTDyzcYlAEDjxo1tnpcEnatXr97ytiXbl2yblJSEvLy8cs/yqOyZH3FxcfIf55JxNH379gVQ9vMZjcYyh7tK1wMAFy5cQGBgIFxcXGza2eOsmVOnTkEIgZYtW8LX19fmduzYMSQlJQEAmjZtiqlTp+Lzzz+Hj48PoqKisHDhwkr9vCpys58HIH3+6vw8bqbkC668/dm2bVukpKQgJycHAPDOO+/g77//RkhICLp37445c+bYBLGq7qeSUFMScm4kKyvLJgCNGjUKeXl5+OmnnwAA2dnZ2LBhAx544AH5y/zUqVMAgDvvvLPMz3jLli3yz7iEVqtFo0aNKqzjet27d0dkZKTNrX///mXatWzZssyyVq1ayeOmKvuzSE5ORmZmJjp06FCp+m72N6Nv374YMWIE5s6dCx8fH9x7771YunQpCgoKKvX6VHs45oYUV7oHo0R6ejr69u0LNzc3vPrqq2jevDmMRiMOHDiAF198sVKnft/orBshRI1uWxkWiwUDBgxAWloaXnzxRbRp0wbOzs64fPkyxo0bV+bzKX0GkdVqhUqlwsaNG8utpXSgmjdvHsaNG4cff/wRW7ZswVNPPYWYmBj88ccft/xlWKKmfx72NnLkSPTp0wc//PADtmzZgnfffRdvv/021q5dK/fsVWU/lUyb8Ndff93wvS9cuIDMzEy0a9dOXtajRw+Ehobi+++/x0MPPYSff/4ZeXl5GDVqlNym5Hfum2++QUBAQJnXvf7MK4PBALXasf5/fLPfM5VKhTVr1uCPP/7Azz//jM2bN+PRRx/FvHnz8Mcff5T5jwUph+GG6qQdO3YgNTUVa9euxR133CEvP3funIJVXePn5wej0YjTp0+XWVfesusdOXIEJ0+exFdffYWxY8fKy7du3Vrlmpo0aYLY2FhkZ2fb/JE9ceJElV+zRPPmzSGEQNOmTdGqVaubtu/YsSM6duyIV155Bb///jt69eqFxYsX4/XXXwdQ9nCAPTRp0qTKP4/Kvj5Q/v48fvw4fHx8bE6FDgwMxJNPPoknn3wSSUlJ6NKlC9544w2bw5Y320/Xa9WqFVq1aoV169bhww8/LPfwVMlZZPfcc4/N8pEjR+LDDz9EZmYmVq1ahdDQUPTo0UNeX3IY1s/PD5GRkZXdLTWipBeptJMnT8qDhCv7szCZTHBzcyv3TKvq6NGjB3r06IE33ngDK1aswJgxY7By5Uo8/vjjdn0fqjrHit3kMEr+B1X6f+aFhYX45JNPlCrJhkajQWRkJNatW2czruL06dPljiEob3vA9vMJIfDhhx9WuabBgwfDbDZj0aJF8jKLxYKPP/64yq9Z4r777oNGo8HcuXPL9JYIIeRZcTMzM2E2m23Wd+zYEWq12qbr3tnZ2a6nqANAVFQUdu/ejUOHDsnL0tLSsHz5cru8fmBgIDp37oyvvvrKpva///4bW7ZsweDBgwFI+/z6w0t+fn4ICgqS90Fl91N5Zs2ahatXr2LChAllpgLYv38/3n77bXTo0AEjRoywWTdq1CgUFBTgq6++wqZNmzBy5Eib9VFRUXBzc8Obb75Z7niS60+Jrknr1q3D5cuX5ed79uzBn3/+KQfDyv4s1Go1hg0bhp9//rncSyvcas/f1atXy2xTMsaLh6bqFvbcUJ3Us2dPeHp6Ijo6Gk899RRUKhW++eabOnUYYs6cOdiyZQt69eqFiRMnwmKxYMGCBejQoYPNF2x52rRpg+bNm2PatGm4fPky3Nzc8H//93+VGg90I0OHDkWvXr0wffp0nD9/Hu3atcPatWurPd4FkP5X//rrr2PGjBk4f/48hg0bBldXV5w7dw4//PADxo8fj2nTpuHXX3/F5MmT8cADD6BVq1Ywm8345ptvoNFobL5su3btim3btuH9999HUFAQmjZtivDw8GrV+MILL+Dbb7/FgAEDMGXKFPlU8MaNGyMtLa3SvUXvv/9+mdOa1Wo1XnrpJbz77rsYNGgQIiIi8Nhjj8mnH7u7u2POnDkApPEujRo1wv33349OnTrBxcUF27Ztw969ezFv3jwAqPR+Ks+YMWOwd+9efPjhhzh69CjGjBkDT09PHDhwAF9++SW8vb2xZs2aMtMddOnSBS1atMDLL7+MgoICm0NSAODm5oZFixbh4YcfRpcuXfDggw/C19cXcXFxWL9+PXr16oUFCxZUah/eyMaNG8s9GaBnz55o1qyZ/LxFixbo3bs3Jk6ciIKCAsyfPx/e3t544YUX5DaV+VkA0mzYW7ZsQd++fTF+/Hi0bdsW8fHxWL16NXbu3CkPEq6Mr776Cp988gmGDx+O5s2bIysrC5999hnc3NzkQEV1hAJnaFEDdaNTwdu3b19u+127dokePXoIk8kkgoKCxAsvvCA2b94sAIjt27fL7W50Kvi7775b5jUBiNmzZ8vPb3QqeHmn8TZp0qTM6cuxsbEiLCxM6PV60bx5c/H555+L5557ThiNxhvshWuOHj0qIiMjhYuLi/Dx8RFPPPGEfMp56dO2o6OjhbOzc5nty6s9NTVVPPzww8LNzU24u7uLhx9+WD7tuDqngpf4v//7P9G7d2/h7OwsnJ2dRZs2bcSkSZPEiRMnhBBCnD17Vjz66KOiefPmwmg0Ci8vL9G/f3+xbds2m9c5fvy4uOOOO4TJZBIA5P16o1PBhwwZUqbGvn37ir59+9osO3jwoOjTp48wGAyiUaNGIiYmRnz00UcCgEhISKjwM5fsz/JuGo1Gbrdt2zbRq1cvYTKZhJubmxg6dKg4evSovL6goEA8//zzolOnTsLV1VU4OzuLTp06iU8++URuU9n9VJF169aJAQMGCE9PT2EwGESLFi3Ec889V+Gp7C+//LIAIFq0aHHDNtu3bxdRUVHC3d1dGI1G0bx5czFu3Dixb98+uc2NfidvpKJTwUv/bpb+tztv3jwREhIiDAaD6NOnjzh8+HCZ173Zz6LEhQsXxNixY4Wvr68wGAyiWbNmYtKkSaKgoMCmvutP8d6+fbvN35sDBw6I0aNHi8aNGwuDwSD8/PzEPffcY7NvqG5QCVGH/itM5ACGDRuGf/75p9xxA1T7nnnmGXz66afIzs5WfGA2Vez8+fNo2rQp3n33XUybN
k3pcqge45gbomq4/lIJp06dwoYNG9CvXz9lCmrgrv95pKam4ptvvkHv3r0ZbIgaEI65IaqGZs2aYdy4cWjWrBkuXLiARYsWQa/X24wNoNoTERGBfv36oW3btkhMTMQXX3yBzMxMzJw5U+nSiKgWMdwQVcPAgQPx3XffISEhAQaDAREREXjzzTfLnYSMat7gwYOxZs0aLFmyBCqVCl26dMEXX3xhM50AETk+jrkhIiIih8IxN0RERORQGG6IiIjIoTS4MTdWqxVXrlyBq6trjUwBT0RERPYnhEBWVhaCgoJuel2zBhdurly5gpCQEKXLICIioiq4ePHiTS/C2+DCTcmF5i5evAg3NzeFqyEiIqLKyMzMREhISLkXjL1egws3JYei3NzcGG6IiIjqmcoMKeGAYiIiInIoDDdERETkUBhuiIiIyKE0uDE3RESkDKvVisLCQqXLoDpMr9ff9DTvymC4ISKiGldYWIhz587BarUqXQrVYWq1Gk2bNoVer6/W6zDcEBFRjRJCID4+HhqNBiEhIXb5nzk5npJJduPj49G4ceNqTbSraLj57bff8O6772L//v2Ij4/HDz/8gGHDhlVq2127dqFv377o0KEDDh06VKN1EhFR1ZnNZuTm5iIoKAhOTk5Kl0N1mK+vL65cuQKz2QydTlfl11E0Pufk5KBTp05YuHDhLW2Xnp6OsWPH4q677qqhyoiIyF4sFgsAVPtQAzm+kt+Rkt+ZqlK052bQoEEYNGjQLW83YcIEPPTQQ9BoNFi3bp39CyMiIrvj9fzoZuz1O1LvDnwuXboUZ8+exezZs5UuhYiIiOqgehVuTp06henTp+Pbb7+FVlu5TqeCggJkZmba3IiIiJQQGhqK+fPnV7r9jh07oFKpkJ6eXmM1OaJ6E24sFgseeughzJ07F61atar0djExMXB3d5dvvCI4ERHdjEqlqvA2Z86cKr3u3r17MX78+Eq379mzJ+Lj4+Hu7l6l96ssRwtR9eZU8KysLOzbtw8HDx7E5MmTAUinjQkhoNVqsWXLFtx5551ltpsxYwamTp0qPy+5qqjdmQuA7CRApQLcK74UOxER1W3x8fHy41WrVmHWrFk4ceKEvMzFxUV+LISAxWKp1BEFX1/fW6pDr9cjICDglrahetRz4+bmhiNHjuDQoUPybcKECWjdujUOHTqE8PDwcrczGAzyFcBr9ErgVw4C8zsAXw2tmdcnIqJaExAQIN/c3d2hUqnk58ePH4erqys2btyIrl27wmAwYOfOnThz5gzuvfde+Pv7w8XFBbfffju2bdtm87rXH5ZSqVT4/PPPMXz4cDg5OaFly5b46aef5PXX96gsW7YMHh4e2Lx5M9q2bQsXFxcMHDjQJoyZzWY89dRT8PDwgLe3N1588UVER0dXeqqV8ly9ehVjx46Fp6cnnJycMGjQIJw6dUpef+HCBQwdOhSenp5wdnZG+/btsWHDBnnbMWPGwNfXFyaTCS1btsTSpUurXEtlKBpusrOz5aACAOfOncOhQ4cQFxcHQOp1GTt2LABp1sIOHTrY3Pz8/GA0GtGhQwc4Ozsr9TEAAGfTCgAAKRk5itZBRFTXCSGQW2hW5CaEsNvnmD59Ot566y0cO3YMt912G7KzszF48GDExsbi4MGDGDhwIIYOHSp/p93I3LlzMXLkSPz1118YPHgwxowZg7S0tBu2z83NxXvvvYdvvvkGv/32G+Li4jBt2jR5/dtvv43ly5dj6dKl2LVrFzIzM6t9ZvG4ceOwb98+/PTTT9i9ezeEEBg8eDCKiooAAJMmTUJBQQF+++03HDlyBG+//bbcuzVz5kwcPXoUGzduxLFjx7Bo0SL4+PhUq56bUfSw1L59+9C/f3/5ecnho+joaCxbtgzx8fE3/aWoKwqsUk60Ws0KV0JEVLflFVnQbtZmRd776KtRcNLb56vv1VdfxYABA+TnXl5e6NSpk/z8tddeww8//ICffvpJHk5RnnHjxmH06NEAgDfffBMfffQR9uzZg4EDB5bbvqioCIsXL0bz5s0BAJMnT8arr74qr//4448xY8YMDB8+HACwYMECuRelKk6dOoWffvoJu3btQs+ePQEAy5cvR0hICNatW4cHHngAcXFxGDFiBDp27AgAaNasmbx9XFwcwsLC0K1bNwBS71VNUzTc9OvXr8IUvWzZsgq3nzNnTpUHddmbRivNpKgVDDdERA1ByZd1iezsbMyZMwfr169HfHw8zGYz8vLybvqf9Ntuu01+7OzsDDc3NyQlJd2wvZOTkxxsACAwMFBun5GRgcTERHTv3l1er9Fo0LVr1ypf1+vYsWPQarU2wz+8vb3RunVrHDt2DADw1FNPYeLEidiyZQsiIyMxYsQI+XNNnDgRI0aMwIEDB3D33Xdj2LBhckiqKfVmQHFdpyoONxpUb1ZFIiJHZ9JpcPTVKMXe216uHw4xbdo0bN26Fe+99x5atGgBk8mE+++//6ZXQr/+MgMqlarCIFJee3sebquKxx9/HFFRUVi/fj22bNmCmJgYzJs3D1OmTMGgQYNw4cIFbNiwAVu3bsVdd92FSZMm4b333quxeurNgOK6TqNhuCEiqgyVSgUnvVaRW03Okrxr1y6MGzcOw4cPR8eOHREQEIDz58/X2PuVx93dHf7+/ti7d6+8zGKx4MCBA1V+zbZt28JsNuPPP/+Ul6WmpuLEiRNo166dvCwkJAQTJkzA2rVr8dxzz+Gzzz6T1/n6+iI6Ohrffvst5s+fjyVLllS5nspgz42dqEsOS4GHpYiIGqKWLVti7dq1GDp0KFQqFWbOnFnlQ0HVMWXKFMTExKBFixZo06YNPv74Y1y9erVSwe7IkSNwdXWVn6tUKnTq1An33nsvnnjiCXz66adwdXXF9OnTERwcjHvvvRcA8Mwzz2DQoEFo1aoVrl69iu3bt6Nt27YAgFmzZqFr165o3749CgoK8Msvv8jragrDjZ2oNdLFvrTsuSEiapDef/99PProo+jZsyd8fHzw4osvKjIr/osvvoiEhASMHTsWGo0G48ePR1RUFDSamx+Su+OOO2yeazQamM1mLF26FE8//TTuueceFBYW4o477sCGDRvkQ2QWiwWTJk3CpUuX4ObmhoEDB+KDDz4AIM3VM2PGDJw/fx4mkwl9+vTBypUr7f/BS1EJpQ/U1bLMzEy4u7sjIyPDrnPeXL4Uh+DPpVHimJ0uTeZHRETIz8/HuXPn0LRpUxiNRqXLaXCsVivatm2LkSNH4rXXXlO6nApV9LtyK9/f7Lmxk5LDUgAAqxnQ6G7cmIiIqIZcuHABW7ZsQd++fVFQUIAFCxbg3LlzeOihh5QurdZwQLGdaLT6a08sRcoVQkREDZparcayZctw++23o1evXjhy5Ai2bdtW4+Nc6hL23NiJutQ1RYS1CDwoRURESggJCcGuXbuULkNR7LmxE22pnhurmWdMERERKYXhxk40Wi2sQuqvMZsLFK6GiIio4WK4sRONWgVz8e60mjnmhoiISCkMN3YihRtp3I2Z4YaIiEgxDDd2olWrYYY0QZJguCEi
IlIMw42dqFVAUXG4sTDcEBERKYbhxk5UKhUsxeHGynluiIjoFsyZMwedO3dWugyHwXBjR2a556biy9sTEVHdplKpKrzNmTOnWq+9bt06m2XTpk1DbGxs9YquhIYSojiJnx2VDCgWnOeGiKhei4+Plx+vWrUKs2bNwokTJ+RlLi4udn0/FxcXu79mQ8aeGzuyqKTdaeFhKSKiei0gIEC+ubu7Q6VS2SxbuXIl2rZtC6PRiDZt2uCTTz6Rty0sLMTkyZMRGBgIo9GIJk2aICYmBgAQGhoKABg+fDhUKpX8/PoelXHjxmHYsGF47733EBgYCG9vb0yaNAlFRde+X+Lj4zFkyBCYTCY0bdoUK1asQGhoKObPn1/lz33kyBHceeedMJlM8Pb2xvjx45GdnS2v37FjB7p37w5nZ2d4eHigV69euHDhAgDg8OHD6N+/P1xdXeHm5oauXbti3759Va6lOthzY0cWueeGh6WIiG5ICKAoV5n31jkBqupdIGf58uWYNWsWFixYgLCwMBw8eBBPPPEEnJ2dER0djY8++gg//fQTvv/+ezRu3BgXL17ExYsXAQB79+6Fn58fli5dioEDB0Kj0dzwfbZv347AwEBs374dp0+fxqhRo9C5c2c88cQTAICxY8ciJSUFO3bsgE6nw9SpU5GUlFTlz5WTk4OoqChERERg7969SEpKwuOPP47Jkydj2bJlMJvNGDZsGJ544gl89913KCwsxJ49e6Aq3p9jxoxBWFgYFi1aBI1Gg0OHDkGnU+Yi0gw3dnRtQDHDDRHRDRXlAm8GKfPeL10B9M7VeonZs2dj3rx5uO+++wAATZs2xdGjR/Hpp58iOjoacXFxaNmyJXr37g2VSoUmTZrI2/r6+gIAPDw8EBAQUOH7eHp6YsGCBdBoNGjTpg2GDBmC2NhYPPHEEzh+/Di2bduGvXv3olu3bgCAzz//HC1btqzy51qxYgXy8/Px9ddfw9lZ2kcLFizA0KFD8fbbb0On0yEjIwP33HMPmjdvDgA2F+OMi4vD888/jzZt2gBAtWqpLh6WsiOLqiTccMwNEZEjysnJwZkzZ/DYY4/J42RcXFzw+uuv48yZMwCkQ0qHDh1C69at8dRTT2HLli1Veq/27dvb9OwEBgbKPTMnTpyAVqtFly5d5PUtWrSAp6dnlT/bsWPH0KlTJznYAECvXr1gtVpx4sQJeHl5Ydy4cYiKisLQoUPx4Ycf2oxNmjp1Kh5//HFERkbirbfekveHEthzY0fXDktxzA0R0Q3pnKQeFKXeuxpKxp989tlnCA8Pt1lXEkS6dOmCc+fOYePGjdi2bRtGjhyJyMhIrFmz5tZKve6QjkqlgtVqrUb11bd06VI89dRT2LRpE1atWoVXXnkFW7duRY8ePTBnzhw89NBDWL9+PTZu3IjZs2dj5cqVGD58eK3XyXBjRxaVFhCc54aIqEIqVbUPDSnF398fQUFBOHv2LMaMGXPDdm5ubhg1ahRGjRqF+++/HwMHDkRaWhq8vLyg0+lgsViqVUfr1q1hNptx8OBBdO3aFQBw+vRpXL16tcqv2bZtWyxbtgw5OTly782uXbugVqvRunVruV1YWBjCwsIwY8YMREREYMWKFejRowcAoFWrVmjVqhWeffZZjB49GkuXLmW4qe+sxYelBA9LERE5rLlz5+Kpp56Cu7s7Bg4ciIKCAuzbtw9Xr17F1KlT8f777yMwMBBhYWFQq9VYvXo1AgIC4OHhAUA6Yyo2Nha9evWCwWCo0qGkNm3aIDIyEuPHj8eiRYug0+nw3HPPwWQyyQN8byQvLw+HDh2yWebq6ooxY8Zg9uzZiI6Oxpw5c5CcnIwpU6bg4Ycfhr+/P86dO4clS5bgX//6F4KCgnDixAmcOnUKY8eORV5eHp5//nncf//9aNq0KS5duoS9e/dixIgRt/zZ7IHhxo7kAcU8LEVE5LAef/xxODk54d1338Xzzz8PZ2dndOzYEc888wwAKSi88847OHXqFDQaDW6//XZs2LABarU0zHXevHmYOnUqPvvsMwQHB+P8+fNVquPrr7/GY489hjvuuAMBAQGIiYnBP//8A6PRWOF2J0+eRFhYmM2yu+66C9u2bcPmzZvx9NNP4/bbb4eTkxNGjBiB999/HwDg5OSE48eP46uvvkJqaioCAwMxadIk/Oc//4HZbEZqairGjh2LxMRE+Pj44L777sPcuXOr9NmqSyWEEIq8s0IyMzPh7u6OjIwMuLm52fW1/3g9Ej3Me3GqRwxaDnzSrq9NRFRf5efn49y5c2jatOlNv3ip6i5duoSQkBBs27YNd911l9LlVElFvyu38v3Nnhs74mEpIiKqLb/++iuys7PRsWNHxMfH44UXXkBoaCjuuOMOpUtTHMONHVlVxWdLcZ4bIiKqYUVFRXjppZdw9uxZuLq6omfPnli+fLliE+fVJQw3dmSRww17boiIqGZFRUUhKipK6TLqJE7iZ0dWtZSWVey5ISIiUgzDjR2ZVXrpgaVA2UKIiOqgBnb+ClWBvX5HGG7syKyWwo3KnK9wJUREdUfJzL2FhezVpoqV/I5UdEHRyuCYGzsyqw0AAJWZPTdERCW0Wi2cnJyQnJwMnU4nz/dCVJrVakVycjKcnJyg1VYvnjDc2JGlpOfGynBDRFRCpVIhMDAQ586dw4ULF5Quh+owtVqNxo0b33SW5ZthuLGjknCjZs8NEZENvV6Pli1b8tAUVUiv19ulZ4/hxo4sJYelOKCYiKgMtVrNGYqpVvDApx3JPTcMN0RERIphuLEji0bquVFb2e1KRESkFEXDzW+//YahQ4ciKCgIKpUK69atq7D92rVrMWDAAPj6+sLNzQ0RERHYvHlz7RRbCdbiw1Ia9twQEREpRtFwk5OTg06dOmHhwoWVav/bb79hwIAB2LBhA/bv34/+/ftj6NChOHjwYA1XWjlWTfFhKZ4tRUREpBhFBxQPGjQIgwYNqnT7+fPn2zx/88038eOPP+Lnn39GWFiYnau7dVYNe26IiIiUVq/H3FitVmRlZcHLy0vpUgAAQiOdBcAxN0RERMqp16eCv/fee8jOzsbIkSNv2KagoAAFBdd6UjIzM2uuIK0UbjQ8LEVERKSYettzs2LFCsydOxfff/89/Pz8btguJiYG7u7u8i0kJKTmitIWH5Zizw0REZFi6mW4WblyJR5//HF8//33iIyMrLDtjBkzkJGRId8uXrxYY3WpdVLPjZbhhoiISDH17rDUd999h0cffRQrV67EkCFDbtreYDDAYDDUQmWASlsSbnhYioiISCmKhpvs7GycPn1afn7u3DkcOnQIXl5eaNy4MWbMmIHLly/j66+/BiAdioqOjsaHH36I8PBwJCQkAABMJhPc3d0V+Qw29CYAgE4UAlYrwCvfEhER1TpFv3337duHsLAw+TTuqVOnIiwsDLNmzQIAxMfHIy4uTm6/ZMkSmM1mTJo0CYGBgfLt6aefVqT+66n0zteeFOUoVwgREVEDpmjPTb9+/SCEuOH6ZcuW2TzfsWNHzRZUTWq9EyxCBY1KAIU5gMF
V6ZKIiIgaHB43sSO9Vo0cFF/xtiBb2WKIiIgaKIYbO9Jp1MiBNO4GhVnKFkNERNRAMdzYkV6rRo5gzw0REZGSGG7sSKdRI7vksFQhww0REZESGG7sSK9RI0cUH5Zizw0REZEiGG7syGZAMXtuiIiIFMFwY0fSYamSAcUMN0REREpguLEjnUbFAcVEREQKY7ixI722VM9NQaayxRARETVQDDd2pNeokSmKL8GQn6FsMURERA0Uw40d6TRqZIDhhoiISEkMN3ak16qRKZykJ3npitZCRETUUDHc2JFee63nRuSnK1sMERFRA8VwY0cmnUbuuRE8LEVERKQIhhs7Muo0yCwZc8PDUkRERIpguLEjjVqFPI0rAEBdmAVYLQpXRERE1PAw3NiZWed67QkPTREREdU6hhs70+sNyBEG6QnDDRERUa1juLEzk77UuBueMUVERFTrGG7szKTXIIOzFBMRESmG4cbOnHRaZIIT+RERESmF4cbOjOy5ISIiUhTDjZ056TTXem445oaIiKjWMdzYmZNewyuDExERKYjhxs5szpbimBsiIqJax3BjZx5OumtXBmfPDRERUa1juLEzTye9fGVwjrkhIiKqfQw3duZuYs8NERGRkhhu7MzDSY8M4SI9YbghIiKqdQw3dubhpOMkfkRERApiuLEzTycdJ/EjIiJSEMONnbmb9Nd6biwFQFGesgURERE1MAw3duZu0iEbJliESlrA3hsiIqJaxXBjZ3qtGq5GHSfyIyIiUgjDTQ3wdtbzdHAiIiKFMNzUAG8XAyfyIyIiUgjDTQ3wYs8NERGRYhhuaoC3c6lLMHDMDRERUa1SNNz89ttvGDp0KIKCgqBSqbBu3bqbbrNjxw506dIFBoMBLVq0wLJly2q8zlvl7aJHJue6ISIiUoSi4SYnJwedOnXCwoULK9X+3LlzGDJkCPr3749Dhw7hmWeeweOPP47NmzfXcKW3xsuZY26IiIiUolXyzQcNGoRBgwZVuv3ixYvRtGlTzJs3DwDQtm1b7Ny5Ex988AGioqJqqsxb5uOixwl5zE26orUQERE1NPVqzM3u3bsRGRlpsywqKgq7d+++4TYFBQXIzMy0udU0L2c957khIiJSSL0KNwkJCfD397dZ5u/vj8zMTOTllX+Zg5iYGLi7u8u3kJCQGq9TOluKY26IiIiUUK/CTVXMmDEDGRkZ8u3ixYs1/p4+pea5EQw3REREtUrRMTe3KiAgAImJiTbLEhMT4ebmBpPJVO42BoMBBoOhNsqTeTpdm+fGmpcOTa2+OxERUcNWr3puIiIiEBsba7Ns69atiIiIUKii8um1algMbtITjrkhIiKqVYqGm+zsbBw6dAiHDh0CIJ3qfejQIcTFxQGQDimNHTtWbj9hwgScPXsWL7zwAo4fP45PPvkE33//PZ599lklyq+Q2tkbAKApzAQsZoWrISIiajgUDTf79u1DWFgYwsLCAABTp05FWFgYZs2aBQCIj4+Xgw4ANG3aFOvXr8fWrVvRqVMnzJs3D59//nmdOg28hN7ZC1ahkp7kpSlbDBERUQOi6Jibfv36QQhxw/XlzT7cr18/HDx4sAarsg8PFxPSE53hhWwgNxVw8VO6JCIiogahXo25qU+8XfS4KlylJzkpyhZDRETUgDDc1BBvZwNSUTyoODdV2WKIiIgaEIabGuLlXKrnJpc9N0RERLWF4aaGeLvokSqHGw4oJiIiqi0MNzXE29mAq+CYGyIiotrGcFNDvF30SBMcc0NERFTbGG5qiK+rAWnFh6Ws7LkhIiKqNQw3NcTLSY8MldRzY8lmuCEiIqotDDc1RK1WAU7SJRgEe26IiIhqDcNNDdK4+gIAtPmpQAUzMRMREZH9MNzUIKO7FG7U1iKgMFvhaoiIiBoGhpsa5OnhiTyhl57w0BQREVGtYLipQf5uRqSVzHXD08GJiIhqBcNNDfJzNSC1ZK4b9twQERHVCoabGuTvZiwVbpKVLYaIiKiBYLipQf5uRqTCXXrCi2cSERHVCoabGuTvZpAvnmnOSlK4GiIiooaB4aYGuZt0SFd5AAAKMhKVLYaIiKiBYLipQSqVCmaTFwDAnMUxN0RERLWB4aamOflI9zxbioiIqFYw3NQwjasfAECbx3BDRERUGxhuapjBw1+6L7zK60sRERHVAoabGubmFQAA0IoioCBT4WqIiIgcH8NNDfP18kS2MEpPOO6GiIioxjHc1LBAdyMvwUBERFSLGG5qWICbEamQwo3I4UR+RERENY3hpoaVvr5U7lVO5EdERFTTGG5qmF6rRo7WEwCQnZagcDVERESOj+GmFhQavQHwEgxERES1geGmNpikcGPhxTOJiIhqHMNNLVC7+kr3eakKV0JEROT4GG5qgcFdmshPl89wQ0REVNMYbmqBc/EsxU5FVxWuhIiIyPEx3NQCd59AAICrNQOwWhWuhoiIyLEx3NQCHz8p3Ghghchj7w0REVFNYripBf6ebkgXzgCArLR4hashIiJybAw3tcCo0yBd5Q4AuJp0WeFqiIiIHBvDTS3J1noAALI4SzEREVGNUjzcLFy4EKGhoTAajQgPD8eePXsqbD9//ny0bt0aJpMJISEhePbZZ5Gfn19L1VZdgd4LAJDH60sRERHVKEXDzapVqzB16lTMnj0bBw4cQKdOnRAVFYWkpPJn8l2xYgWmT5+O2bNn49ixY/jiiy+watUqvPTSS7Vc+a0zm3yk+0yGGyIiopqkaLh5//338cQTT+CRRx5Bu3btsHjxYjg5OeHLL78st/3vv/+OXr164aGHHkJoaCjuvvtujB49+qa9PXWB2kWapVjkpChcCRERkWNTLNwUFhZi//79iIyMvFaMWo3IyEjs3r273G169uyJ/fv3y2Hm7Nmz2LBhAwYPHnzD9ykoKEBmZqbNTQm64kswaPMZboiIiGqSVqk3TklJgcVigb+/v81yf39/HD9+vNxtHnroIaSkpKB3794QQsBsNmPChAkVHpaKiYnB3Llz7Vp7VZg8pVmKDYWc54aIiKgmKT6g+Fbs2LEDb775Jj755BMcOHAAa9euxfr16/Haa6/dcJsZM2YgIyNDvl28eLEWK77G1VuayM/FfBVCCEVqICIiaggU67nx8fGBRqNBYqLtANvExEQEBASUu83MmTPx8MMP4/HHHwcAdOzYETk5ORg/fjxefvllqNVls5rBYIDBYLD/B7hFXr7B0j0ykVVghptRp3BFREREjkmxnhu9Xo+uXbsiNjZWXma1WhEbG4uIiIhyt8nNzS0TYDQaDQDU+d4Qo4cU2DxV2UhIy1K4GiIiIselWM8NAEydOhXR0dHo1q0bunfvjvnz5yMnJwePPPIIAGDs2LEIDg5GTEwMAGDo0KF4//33ERYWhvDwcJw+fRozZ87E0KFD5ZBTZ5k8YYEaGliRmhwPBHkpXREREZFDUjTcjBo1CsnJyZg1axYSEhLQuXNnbNq0SR5kHBcXZ9NT88orr0ClUuGVV17B5cuX4evri6FDh+KNN95Q6iNUnlqNbLUb3K3pyEy5AqC90hURERE5JJ
Wo68dz7CwzMxPu7u7IyMiAm5tbrb53wlthCMg/i//rsAAj7n+4Vt+biIioPruV7+96dbZUfVdk8AYAFKaXPwMzERERVR/DTS0SztIlGKzZyQpXQkRE5LgYbmqRxtUPAKDOY7ghIiKqKQw3tcjoLg2UNhSkKlwJERGR42K4qUXOvo0AAN6WVGQXmBWuhoiIyDEx3NQio1cIACBAlYaEjHyFqyEiInJMVQo3Fy9exKVLl+Tne/bswTPPPIMlS5bYrTCH5CZdgiGQ4YaIiKjGVCncPPTQQ9i+fTsAICEhAQMGDMCePXvw8ssv49VXX7VrgQ7FLUi6U+UiOTVF4WKIiIgcU5XCzd9//43u3bsDAL7//nt06NABv//+O5YvX45ly5bZsz7HYnBFntoFAJCdrMzVyYmIiBxdlcJNUVGRfKXtbdu24V//+hcAoE2bNoiPj7dfdQ4oxyidDl6YxnBDRERUE6oUbtq3b4/Fixfjf//7H7Zu3YqBAwcCAK5cuQJvb2+7FuhoipwDpQeZlypuSERERFVSpXDz9ttv49NPP0W/fv0wevRodOrUCQDw008/yYer6AaKx93octjDRUREVBOqdFXwfv36ISUlBZmZmfD09JSXjx8/Hk5OTnYrzhHpvBoDZwCX/ASlSyEiInJIVeq5ycvLQ0FBgRxsLly4gPnz5+PEiRPw8/Oza4GOxtmvGQDAz5KIvEKLwtUQERE5niqFm3vvvRdff/01ACA9PR3h4eGYN28ehg0bhkWLFtm1QEdj9A0FADRSpeDS1VxliyEiInJAVQo3Bw4cQJ8+fQAAa9asgb+/Py5cuICvv/4aH330kV0LdDQqjyYAgCBVCi4kZylcDRERkeOpUrjJzc2Fq6srAGDLli247777oFar0aNHD1y4cMGuBToc10BYoIFeZUFSPPcVERGRvVUp3LRo0QLr1q3DxYsXsXnzZtx9990AgKSkJLi5udm1QIej0SLLIF0dPCfxrMLFEBEROZ4qhZtZs2Zh2rRpCA0NRffu3REREQFA6sUJCwuza4GOqMBFujq4Oe28soUQERE5oCqdCn7//fejd+/eiI+Pl+e4AYC77roLw4cPt1txjkrl0RhI3QNdFifyIyIisrcqhRsACAgIQEBAgHx18EaNGnECv0py8msGnAFc86+g0GyFXlulDjQiIiIqR5W+Va1WK1599VW4u7ujSZMmaNKkCTw8PPDaa6/BarXau0aH4+wvzXUTjGRcTs9TuBoiIiLHUqWem5dffhlffPEF3nrrLfTq1QsAsHPnTsyZMwf5+fl444037Fqko1F5SqeDN1Kl4FxqDpr6OCtcERERkeOoUrj56quv8Pnnn8tXAweA2267DcHBwXjyyScZbm7GozEAaa6b/yZnAq05qzMREZG9VOmwVFpaGtq0aVNmeZs2bZCWllbtohyeayAsKmmum9SEi0pXQ0RE5FCqFG46deqEBQsWlFm+YMEC3HbbbdUuyuGpNcg1BgIA8pPPKVwMERGRY6nSYal33nkHQ4YMwbZt2+Q5bnbv3o2LFy9iw4YNdi3QUVncQ4C8S0B6nNKlEBEROZQq9dz07dsXJ0+exPDhw5Geno709HTcd999+Oeff/DNN9/Yu0aHpPMOBQCYci/BYhXKFkNERORAqjzPTVBQUJmBw4cPH8YXX3yBJUuWVLswR2fybQoACBTJuJKehxAvJ4UrIiIicgycPU4havl08GRcSM1VuBoiIiLHwXCjlOLTwUNUyTifmqNwMURERI6D4UYp8lw3qYhLyVS4GCIiIsdxS2Nu7rvvvgrXp6enV6eWhsU1EFaVFjqYkZ50EUBHpSsiIiJyCLcUbtzd3W+6fuzYsdUqqMFQa1DgHAhT9kWYU88rXQ0REZHDuKVws3Tp0pqqo2HyaAJkX4Q28yKsVgG1WqV0RURERPUex9woyOATCgDwtybhSgavDk5ERGQPDDcKUpe6OvjZZJ4xRUREZA8MN0oqPmOqkSoZZ5OzFS6GiIjIMSgebhYuXIjQ0FAYjUaEh4djz549FbZPT0/HpEmTEBgYCIPBgFatWtXf61mVDjcp7LkhIiKyhypffsEeVq1ahalTp2Lx4sUIDw/H/PnzERUVhRMnTsDPz69M+8LCQgwYMAB+fn5Ys2YNgoODceHCBXh4eNR+8fZQaq6bc0mc64aIiMgeFA0377//Pp544gk88sgjAIDFixdj/fr1+PLLLzF9+vQy7b/88kukpaXh999/h06nAwCEhobWZsn2VWqum6zki0pXQ0RE5BAUOyxVWFiI/fv3IzIy8loxajUiIyOxe/fucrf56aefEBERgUmTJsHf3x8dOnTAm2++CYvFcsP3KSgoQGZmps2tzlBrINwaAQB0WZeQW2hWuCAiIqL6T7Fwk5KSAovFAn9/f5vl/v7+SEhIKHebs2fPYs2aNbBYLNiwYQNmzpyJefPm4fXXX7/h+8TExMDd3V2+hYSE2PVzVJfG69q4m3Mcd0NERFRtig8ovhVWqxV+fn5YsmQJunbtilGjRuHll1/G4sWLb7jNjBkzkJGRId8uXqxjh39szphiuCEiIqouxcbc+Pj4QKPRIDEx0WZ5YmIiAgICyt0mMDAQOp0OGo1GXta2bVskJCSgsLAQer2+zDYGgwEGg8G+xduTB+e6ISIisifFem70ej26du2K2NhYeZnVakVsbCwiIiLK3aZXr144ffo0rFarvOzkyZMIDAwsN9jUCzang3OuGyIioupS9LDU1KlT8dlnn+Grr77CsWPHMHHiROTk5MhnT40dOxYzZsyQ20+cOBFpaWl4+umncfLkSaxfvx5vvvkmJk2apNRHqD4eliIiIrIrRU8FHzVqFJKTkzFr1iwkJCSgc+fO2LRpkzzIOC4uDmr1tfwVEhKCzZs349lnn8Vtt92G4OBgPP3003jxxReV+gjVV2qum/PJmRBCQKXiBTSJiIiqSiWEEEoXUZsyMzPh7u6OjIwMuLm5KV0OYLVAvO4HldWMiPyPse6lkfB3MypdFRERUZ1yK9/f9epsKYek1kDlLs11w0NTRERE1cdwUxdwUDEREZHdMNzUBaXCzZkk9twQERFVB8NNXVBqrpvTyey5ISIiqg6Gm7qgVM/NqcQshYshIiKq3xhu6oJS4SY+Ix+Z+UUKF0RERFR/MdzUBcXhJlidCjWsOJ3EQ1NERERVxXBTF7gGAmottLDAH1d5aIqIiKgaGG7qArUGKDXXzclE9twQERFVFcNNXVFq3M1J9twQERFVGcNNXVEq3HDMDRERUdUx3NQVpea64RlTREREVcdwU1cU99w006UCAE5x3A0REVGVMNzUFcXhpok6GQB4xhQREVEVMdzUFZ6hAAAfSzJ0MOMUx90QERFVCcNNXeEaCOicoYYFjVWJPGOKiIioihhu6gqVCvBuDgBoqkrgmBsiIqIqYripS3xaAgCaqa4gITMfGXk8Y4qIiOhWMdzUJd4tAADtDdKg4tNJPDRFRER0qxhu6hJvqeemjS4RAHAsnuGGiIjoVjHc1CXFY26CrZcBAEfjM5WshoiIqF5iu
KlLig9LuRSlwRW5OHqF4YaIiOhWMdzUJUY3wMUfANBUFY/jCZmwWIXCRREREdUvDDd1TalxN/lFVpxLyVG4ICIiovqF4aauKR53081FusYUx90QERHdGoabuqZ4rps2+iQA4LgbIiKiW8RwU9cUDypuZOEZU0RERFXBcFPXFI+5cc+7ABWs7LkhIiK6RQw3dY1nKKAxQGPOQ2NVMlKyC5CUla90VURERPUGw01do9ECPq0AAH08igcVs/eGiIio0hhu6iK/tgCA7s7SZRj+YbghIiKqNIabuqg43LTVXAIA/HMlQ8lqiIiI6hWGm7rIrx0AIKjwPADg8EWGGyIiospiuKmL/NoAAJwyz0CrsuByeh5SswsULoqIiKh+YLipi9wbAzpnqCyF6O0pjbf56xJ7b4iIiCqD4aYuUqvl3pu+ntIZUww3RERElcNwU1f5SoOKOxuuAAD+upSuYDFERET1B8NNXVV8xlSoNQ4AcPhSBoQQSlZERERUL9SJcLNw4UKEhobCaDQiPDwce/bsqdR2K1euhEqlwrBhw2q2QCUUhxv3rFPQqlVIyS5AfAZnKiYiIroZxcPNqlWrMHXqVMyePRsHDhxAp06dEBUVhaSkpAq3O3/+PKZNm4Y+ffrUUqW1LKAjAECdehod/XQAeGiKiIioMhQPN++//z6eeOIJPPLII2jXrh0WL14MJycnfPnllzfcxmKxYMyYMZg7dy6aNWtWi9XWIhc/wCUAgMDd3skApENTREREVDFFw01hYSH279+PyMhIeZlarUZkZCR27959w+1effVV+Pn54bHHHrvpexQUFCAzM9PmVm8E3gYA6GaQZipmzw0REdHNKRpuUlJSYLFY4O/vb7Pc398fCQkJ5W6zc+dOfPHFF/jss88q9R4xMTFwd3eXbyEhIdWuu9YESOGmueUMAOl0cKuVg4qJiIgqovhhqVuRlZWFhx9+GJ999hl8fHwqtc2MGTOQkZEh3y5evFjDVdpR8bgbz8zjMOk0yMo343RytsJFERER1W1aJd/cx8cHGo0GiYmJNssTExMREBBQpv2ZM2dw/vx5DB06VF5mtVoBAFqtFidOnEDz5s1ttjEYDDAYDDVQfS0oPiylSjqKro1csPNcBvZfuIpW/q4KF0ZERFR3Kdpzo9fr0bVrV8TGxsrLrFYrYmNjERERUaZ9mzZtcOTIERw6dEi+/etf/0L//v1x6NCh+nXIqTI8QgGDG2ApRKRvOgBg/4WripZERERU1ynacwMAU6dORXR0NLp164bu3btj/vz5yMnJwSOPPAIAGDt2LIKDgxETEwOj0YgOHTrYbO/h4QEAZZY7BLVaOjR1YRfCTZcAhOAAww0REVGFFA83o0aNQnJyMmbNmoWEhAR07twZmzZtkgcZx8XFQa2uV0OD7CvgNuDCLjQznwEQgrMpOUjNLoC3Sz091EZERFTDVKKBzemfmZkJd3d3ZGRkwM3NTelybu7QCmDdRKBJLwy4+iJOJWXjs7HdMKCd/823JSIichC38v3dgLtE6oniM6aQcATdGrsD4LgbIiKiijDc1HW+bQGtCSjIxB0+0gzFHHdDRER0Yww3dZ1GCwR1BgB01UiT+R2+lI5Cs1XBooiIiOouhpv6ILgrAMA34294OulQYLbi7yu8zhQREVF5GG7qg0bdAACqy/vQtYkXAGDPuTQlKyIiIqqzGG7qg2Ap3CDxH/QOdQYA7D6TqmBBREREdRfDTX3g3ghw8QesZtzhehkAsPd8GoosHHdDRER0PYab+kClksfdhOYdg4eTDrmFFhy5zHE3RERE12O4qS+Kw436yn6EN5XG3fDQFBERUVkMN/VF8aBiXNqPHs28AQB/nGW4ISIiuh7DTX0R1AWACsiIQ+8AMwBg3/mrnO+GiIjoOgw39YXRDQiQrnzePPcIvJz1yCuy4K9L6crWRUREVMcw3NQnTXoBANRxv8vjbn7nuBsiIiIbDDf1SeMI6T5uN3q18AEA/O9UsoIFERER1T0MN/VJk57SfeI/6NdYBwA4EJeOjLwiBYsiIiKqWxhu6hMXP8C7BQCBRll/obmvMyxWgV2nU5SujIiIqM5guKlvSnpv4n5H31Z+AID/nuChKSIiohIMN/VN4+Jwc+F39GvtCwD478lkCCEULIqIiKjuYLipb5oUDyq+chDdgw0w6tRIyMzHicQsZesiIiKqIxhu6huPJoB7Y8BqhvHKn4gonq14Bw9NERERAWC4qX9UKqB5P+nx2R3o11oadxN7LFG5moiIiOoQhpv6qFk/6f7Mdgxo5w8A2HfhKpKzCpSriYiIqI5guKmPmvYDoAKS/kGQJhO3NXKHEMA29t4QEREx3NRLzt5A4G3S43P/RVT7AADApr8TFCyKiIiobmC4qa9KHZoa2EEKN7+fSeFsxURE1OAx3NRXzfpL92e3o7mPM1r6uaDIIrD9eJKydRERESmM4aa+ahwBaE1AVjyQ+Ld8aGrDkXiFCyMiIlIWw019pTMCzYt7b05sxD2dAgFI892k5xYqWBgREZGyGG7qs9aDpfvj69EmwA1tAlxRaLFiwxEOLCYiooaL4aY+azUQgAqIPwRkXMZ9XYIBAD8cvKRoWUREREpiuKnPXHyBkO7S45Mb8a9OwVCpgL3nr+JiWq6ytRERESmE4aa+kw9NbUCAuxE9m0vXmlp38LKCRRERESmH4aa+Kwk3534D8jMxoksjAMDKvRdhsQoFCyMiIlIGw01959sK8G4BWIuAU1swuGMg3E06XE7Pw28neaVwIiJqeBhuHEG7YdL9kTUw6jR4oKvUe/PtHxeUq4mIiEghDDeOoOMD0v3pbUBuGsb0aAIA+PVEEi5d5cBiIiJqWBhuHIFfG8C/g3Ro6tjPaOrjjN4tfCAE8O0fcUpXR0REVKsYbhxFx/ul+yOrAQDRPUMBAMv/vICsfF5Mk4iIGo46EW4WLlyI0NBQGI1GhIeHY8+ePTds+9lnn6FPnz7w9PSEp6cnIiMjK2zfYHQYId2f3wlcPY+72vihhZ8LsvLNWPEne2+IiKgWFGQDez8HctMULUPxcLNq1SpMnToVs2fPxoEDB9CpUydERUUhKan8q1vv2LEDo0ePxvbt27F7926EhITg7rvvxuXLDXxeF4/GxVcKF8CBr6FWq/CfO5oBAD7feQ75RRZl6yMiIsf101PAHHcgJhhY/xzwTlNFy1EJIRSdDCU8PBy33347FixYAACwWq0ICQnBlClTMH369Jtub7FY4OnpiQULFmDs2LE3bZ+ZmQl3d3dkZGTAzc2t2vXXKUd/BL4fC7j4A8/+g0KhQd93tyM+Ix9vDO+AMeFNlK6QiIgcxX/fAba/ceP1czLs+na38v2taM9NYWEh9u/fj8jISHmZWq1GZGQkdu/eXanXyM3NRVFREby8vMpdX1BQgMzMTJubw2o9GHD2A7ITgRMboNeqMb649+bj2NPsvSEioqoRAti/TOqdKbndKNj862O7B5tbpWi4SUlJgcVigb+/v81yf39/JCRU7srWL774IoKCgmwCUmkxMTFwd3eXbyEhIdWu
u87S6ICwf0uP9y0FADwU3hjBHiYkZOZj2e/nlauNiIjqj/SLtkFmrgfw89M3bt/pISnQzMkAutz8KEpNU3zMTXW89dZbWLlyJX744QcYjcZy28yYMQMZGRny7eLFi7VcZS3rGg1ABZzdDiSfhEGrwbMDWgEAPtl+Ghm5PHOKiIhKKcoDlt1jG2bmd7j5do1uB16KlwLN8EU1X+ct0Cr55j4+PtBoNEhMTLRZnpiYiICAgAq3fe+99/DWW29h27ZtuO22227YzmAwwGAw2KXeesEzVDo8dWI98PuHwL0LMTwsGEt+O4OTidn4MPYUZg1tp3SVRESklLSz0gDg8/+7te0eWAa0/Reg1tRIWfakaM+NXq9H165dERsbKy+zWq2IjY1FRETEDbd755138Nprr2HTpk3o1q1bbZRav/R+Rro/vArIvAKNWoWXh0iB5qvd53H0igOPOyIiIltCAHF/XOuV+Sjs5sFm7I/A9Lhrh5rmZADth9eLYAMo3HMDAFOnTkV0dDS6deuG7t27Y/78+cjJycEjjzwCABg7diyCg4MRExMDAHj77bcxa9YsrFixAqGhofLYHBcXF7i4uCj2OeqUkO5A455A3O/A7oVA1Bvo28oXgzsGYMORBLyy7gjWTOgJtVqldKVERFRT1j0JHFpeubYzLgMGx/kOVTzcjBo1CsnJyZg1axYSEhLQuXNnbNq0SR5kHBcXB7X6WgfTokWLUFhYiPvvv9/mdWbPno05c+bUZul1W+9ngRW/S6Pbe08FnL0x6572+O+JZByIS8fyPXF4uAdPDScicijLHwBObbl5uykHAO/mNV+PQhSf56a2OfQ8N6UJAXx6B5DwF9D9P8DgdwAAX+48h1d/OQqTToP1T/VGM1/HSepERA3Cpf2ApQDwbQOkX5DObNr+JpB87MbbtB4MPLgCUNXfHvtb+f5muHFkZ3cAX98LqLXAk38CPi1gtQr8+4s/8fuZVHQK8cCaCRHQaer1SXNERI5HCODSXsDZV5q7LDsJSDsD7PkMyKzkjPydxwD3LqzXgaY0hpsKNKhwAwDLRwKnNgOthwCjVwAArqTnYeD835CZb8Z/7miGGYPbKlwkEVEDdWITsO8LaY4yIaQAk3oWOPRt5V/Dty2QfFw6mSSkB9CsL6Az1VjJSmG4qUCDCzfJJ4BPIgBhAcatB0J7AwA2HInHk8sPAAA+Hh2GoZ2ClKySiMixXd4PfHMf0CISCO4CpJ6RxsZkVHLutcDOgE8rwL0RsPN9adk984Fuj9RUxXUOw00FGly4AYBfpkr/M/DvCIzfLs1kDCBm4zF8+t+zMOrUWDOhJzoEuytcKBGRA8i7CuSlA4n/AFcOAic3A4lHbr6dsx/QvD/g1Vxqf+xnoNNo4N5PADWHDzDcVKBBhpucFGBBN+kfXN/pQP8ZAACLVWDc0j3436kU+LoasGZCBJp4OytcLBFRPZRxGfhzMXDwG+lvbUVaDQL82gA+rYF1E6RlM1MBjeInMNdpDDcVaJDhBgCOrAH+7zFpcPHjsUBQZwBARl4RRn26G8cTstDYywlrJkTAz638S1kQEVExqwXY+wWw8fmK23V6CGjUDchKAIzuQM/JtVOfA2K4qUCDDTdCAKujgaM/At4tgCd+lf6hAUjKyscDi3fjQmoumvk6Y/nj4Qh0d7zBaEREVWYxAwe/Bn55tuJ2HR8AjqyW5heLnF07tTUQDDcVaLDhBgByUoFP+0inEbYeAoz6Vj6OG5eaiweX7MaVjHwEe5iw/PFwhPrwEBURNRClvwrN+dLJGMfXA7+9U7nth34kXQ3bQU67rosYbirQoMMNIE3+tHQgYCkE7pwJ3DFNXnU5PQ///vxPnEvJgbezHosf7orbQ70ULJaIyM7+Xgv8MEGaBK861Drgmb8AN55pWlsYbirQ4MMNAOz/Cvj5KQAq4P4vgA4j5FXJWQUYt3QP/rmSCZ1GhVfv7YDR3RsrVysRUVUV5QMJR4BLe4ALvwPHf6ne6z11EPBqZp/a6JYx3FSA4abY+ueAvZ9LA4xHfQu0HiSvyi004/nVf2H9kXgAwH1hwZhzb3u4GXVKVUtEdGNCADnJQOppYOmgm7cv4RoIeIYCWqM0BjHs34BGL02AZ/IC3IOldTzUVCcw3FSA4aaY1SJ1zR75XvrH/OB3QMtIebUQAp/sOIN5W07AKoBgDxPeH9kJ4c28FSyaiBosIYD8dCDlNPDFtb9VCOwMpJ0DCjLK387gBgSFASHdgYDbpIlMnXi4vT5iuKkAw00pFjOw5hHg2E9SD869nwCdRtk02X8hDc+sOoSLaXkAgAe6NsL0QW3g7WJQomIicmRWi3QdpcsHgFVjpGXN+kvXVUqPAwqzKthYBXg0li4kWeLJP6SLS7LnxSEw3FSA4eY65kLgxyelUxcB4I4XgH7TAbVGbpKVX4Q31h/Dyr3SNOHuJh2eiWyJh8Ibw6DVlPeqRES2rFbg7Hbg2/uk53fOlCa7y7gk3TKvSMFGWCp+HbUOsBZJj1sMALpGS9NbeDYFdJyjy5Ex3FSA4aYcViuwdSawe4H0vFk/YPgSwNXfptn+C2l4Zd0/OBafCUA6VPXUXS0woksjaHllcSICpMNHuanShRwTjwJJ/xTfH7tJz0sxlcY24Ax4FfBrL/XKuDcC9E41VzvVaQw3FWC4qcBf3wM/Pw0U5UqD6+5+HQh72KZL12yxYtW+i/go9hQSM6VTKYM9TBjXMxSjuodw0DFRQ5J3VRoDk3JCuhBkyQUdKyO0DxDYSQosbsHS4F3XIMDFz6bnmKgEw00FGG5uIuk48MN4IP6w9LxJL2Dwu4B/e5tm+UUWfPvHBSzacQapOYUAAGe9BiO6NsIDXUPQIdgNKh7nJqr/hJAOF6Wekc5GSj0N/P7Rzbdzbwz4t5P+dvgV33u3APIzAGefmq+bHA7DTQUYbirBYpYuALf9DakXByqgw31AvxmAT0ubpvlFFvx46DK+2HkOJxOz5eWt/F0woksjDO4YiBAvdiMT1WlCSL0waeeuBZjU00DaGSnUFGbfeFsXf8CvrXQl65STwPn/Ac+dLHNYm6i6GG4qwHBzC65eALbNBv75oXiBCmgRCXR/Qrov1XUshMCu06lYvf8iNv2dgAKzVV7XLtANd7f3x4B2/mgXyB4dIkUU5UlnHF09L/3bvnpeOrOo5HFF42FUamnMi1dzqfcl8zKQkwI8sAxwC6yd+qnBY7ipAMNNFSQcAba/CZzYcG2ZRxOg02ig3b3S/9pKBZbM/CKs/yse6w5ext7zabCW+g3zcTGgZ3Nv9GrhjZ7NfdirQ2QvhTlAZrwUPDIulg0w2Qk3fw3XQCm8eDe/FmS8mxdPdMfpH0hZDDcVYLiphtQzwL4vgYPfSpNplfBpJc1w3Kw/0DjC5nTMtJxCxB5LxJajifjfqWTkF1ltXjLYw4TOjT0QFuKBsMYeaB/kDqOOgwmJbBRkS6El87J0ynTmleIQc/na49L/Jm9E7yoFFc8mxfeh0n9UPJtIPTM6U81+DqJqYLipAMONHRTmAsd+lg5XnYmVLsJ
ZQmsEmvSUZgEN7ibNDGqU9nOB2YIDF9Lx+5kU7DqdgsOXMmCx2v76adUqtA5wRZsAN7QJcC1+7ApfVwMPZ5FjMRcCOUnSBHU5ycX3SUB2su3yzPgbz757PZ2zdNaRW/C1AONRKsiYPDmhHdVbDDcVYLixs/wM4OQW4Myv0q1M17dKGoTs11Y6Y6Lk3rMpss3A4YvpOHQxHQfjpPuU7PKv1OvppENLf1c09XZGqI8zmvo4IdTHGaHezuzpIeUJIQ2+z7sK5KZJ9yW33JRSgaVUcKlMT0tpBnfpCtTyLVi6LwkzbkHSpQYYXshBMdxUgOGmBgkhTdx1dgdw8U/g0n4gI678tiqN9EdZ7hIPhfAIQYrKC8eznfF3pglHUqw4npiN8yk5sFbwWxrobkQTbycEeZgQ7GFCkIcJge5G+bGzQVsjH5ccjKUIKMiSzgwqyJbuC7OB/Ewg7/rAUupxybrSPZiVpdYCzr7S3C7OfsX3pZ/7SuNg3IIAg6v9PzNRPcJwUwGGm1qWlQgkHpFmJ006DiQdlQJQUe7Nt9UaAWc/WI0eyNW4IgMuSLU4I9FswqU8A87n6pFSqEcODMgTRuTAgFwYkSuK72GAGVq4m3QIdDfC380IP1cDfItvfq7GUo8NDEH1idUCmPMBc4F0X5Atne1TEkoq9TzbNsxYyu81vCVqnXRRRpOndFVpkyfg5HmD4OIHGD0ANWf3JqoMhpsKMNzUAVarNClYyVkcpe+zk6RDW/mVHGNwEwVCKwedQqFFIXQoRPG9uPa4AFpY1XqodQZodUZo9NK9zmCEQa+HwaCHSW+A0aiHk8EAJ6MBBoMBarVG+t+3WiudGi8/LnlezjKVuvwbVNIhhTLrrj/McN1zm/UVrBMCgACEtfhW6jGE7fMy66xSoLBaAKu5glup9ZaisstKlpvzpZ6O0gHFXGD72FL6+XVtb3b9oerQ6AG9C2BwkQbgGlxLBZZSt+tDjMkT0DvzsBBRDbmV72/+V5Vqn1otzY3hFgg07lF+m6I8KQBlJ0tjE/KuAnnptocG8q5Kp78WZks9QSWPC3OkL1EABpUZBmTDE9llvvfLZS6+5dnno1ItUGulUKF3LQ4kLtfuSz+Ww8qNnrtK91q90p+IiKqJ4YbqJp3p2hkeVWEuvC705BT3FBTY9hgULysoyENOTi5y83KRn5eHwoI8FBXmoaiwEEVFRSgqKoLZLN2s5iIIqxkaWKGBFVpYoIEFWpUVGliuW35tvQZWaFUWqACoIaBVCahVAhpI92pYoS5ep4IVKgiohVUOZdLdtY5WFVDcG1Piuk7YMutUtr1BNj1GJctRTm9Scfsb9k5ppcMxN1qv0V3XXicdctQaiu/1155rDKWWX/e4vHUaA6DhnzEissW/CuSYtHpA6wXAq1LNDcW3yrUGCs1WpOcWIi23EGk5hbiaU4SruYXIzC9CRl4RMvPMyMwrfly8TFpeVOHg6FvhYtDC1ai9dm/UwdVmmQ4uRum5q0Fb/FgHF4MWbkbpuUmn4Sn2RORwGG6IqkCvVcPPzQg/N+PNG5cihEB2gVkOQBmlAlBmcfiRnpuRlW9GdkFR8b30PCu/CEUWKR1lF0jLq0OjVsFZr4GzQSvfXAwaOOmlgORsKF6nL29d8Xp5nRZGnZphiYgUx3BDVItUKhVcjTq4GnWAZ9VeI7/IIgWb4gCUVRKAisNPdoEZWcVhyGaZHJikZVYBWKwCmflmZOZXLySVUKsgh53rg1HJcxeDFk56TamApIWzXgOTXgpOpR876TUwaBmYiOjWMNwQ1TNGnQZGnQY+LlW/1o8QAnlFFjns5BSYkVNgke4LpWW5BZZr6wqvrc8ufm67Xjp7ySogBatq9iiVplYBTnptceC5FnqcSj036TVw0mngZLi2zqSTwlTJOvmxXgMnnRZOBg10Gp6GTeSIGG6IGiCVSlUcErTwt8PrWa1SWCoJP7mF14JPyXM5GBWHoZzr1ucWWpBXaEFuobS+sPjK8lZhn0Nw5dFpVDDpigOT4VrwMek1cDZoYNJpYdKrYdJJYcmo18ColXqWTDoNjDo1jMXrTHqN/NhYvM6k00DLAEVU6xhuiKja1GqVfIjJz06vabZYkVckBZ6c4tBT8jiv8Fogyi1+nFf8PKe4Xcmy0s9L2pqLR3UXWQSKLPY7LFcenUYl97bZhiR1qZBUar1eLQeoMst1tgHKoFVLt+LHWrWKh/CIwHBDRHWUVqOGq0YtjU+ys0KzVQ4+pXuMSgcmqSfKgvwi6VYStPKLt7VZXmRBfql1eUXXJhksCVBZNRigSqhVgEGrgUFXHHq0xQFIV+pxyfKbtZHDUznrSz3Wa9XQaaRtdBo1NGqGK1Ieww0RNTj64i9ldyf7BydAGtNUYLbahqIiqScqXw5JllIhySqHpLxCCwqK1+UV3Wg76bULzVYUWqzy+1oF5NdRikatgl6jhk6jgl6rgV6jkgNQ6Xu9/Fxqp9OopLCkqajttW0MFbymXqOGTquGTq2CruSxRgWdWg01w1eDwHBDRGRnKtW1Q1EeNfxeVqtAocWKgiIrCswWFJil+/wiq/y4wHz9eisKiko9NluK15duX7ntSqYmKGGxCuRZLcgrAqTpvusWjVolB52S0KNVXwtN2uLl+hs81mmk8KbVSMGp9GNdcajTadTQaq5tV7Jeq1YVh66SZSXLi9+7uI2++BCjVlOqJg0POd4KhhsionpMrVbBqJaCFFAzPVEVsVoFiqxWFJqtKLKI4nsp/BRZrPLzkl4muZ3FgiKzQIHFiqLidSX3hddtV/K6ZZeXeh+LFUVmIb9OgUVafv3VEy1WAYtVIB9WwA7XSq1NUuBRlQpH6uIgpZKCU6nnJW0qDlnFgaok7Gmuha6SQFUS1MouK/tcfm2NNJ6sOmd0VntfKfbOpSxcuBDvvvsuEhIS0KlTJ3z88cfo3r37DduvXr0aM2fOxPnz59GyZUu8/fbbGDx4cC1WTEREgBSuDGoNDFqN0qWUy2IVKCoOOtL4p0o8NlthtlpReIPHJWGrzGOzuBa0LFaYLQJFVgFz8eNCi9TOXPx+Zqv0mqXbFBW/pqWcqczNVgFzSTCr4zqFeODHSb0Ue3/Fw82qVaswdepULF68GOHh4Zg/fz6ioqJw4sQJ+PmVPe/i999/x+jRoxETE4N77rkHK1aswLBhw3DgwAF06NBBgU9ARER1lUatgkbu2ao/rMVBpiRAyWGpOBSZi0OU2XKtjblUSDNbbbcpE6DMpbeT2kuvdy2UlQ5zFdVRVDqsFa836ZSdAkElxPWddrUrPDwct99+OxYsWAAAsFqtCAkJwZQpUzB9+vQy7UeNGoWcnBz88ssv8rIePXqgc+fOWLx48U3f71YumU5ERER1w618fysarQoLC7F//35ERkbKy9RqNSIjI7F79+5yt9m9e7dNewCIioq6YfuCggJkZmba3IiIiMhxKRpuUlJSYLFY4O9vO0eqv78/EhISyt0mISHhltrHxMTA3d1dvoWEhNineCIiIqqTHH5e8BkzZiAjI0O+Xbx4UemSiIiIqAYpOq
DYx8cHGo0GiYmJNssTExMREBBQ7jYBAQG31N5gMMBgUO50NCIiIqpdivbc6PV6dO3aFbGxsfIyq9WK2NhYRERElLtNRESETXsA2Lp16w3bExERUcOi+KngU6dORXR0NLp164bu3btj/vz5yMnJwSOPPAIAGDt2LIKDgxETEwMAePrpp9G3b1/MmzcPQ4YMwcqVK7Fv3z4sWbJEyY9BREREdYTi4WbUqFFITk7GrFmzkJCQgM6dO2PTpk3yoOG4uDio1dc6mHr27IkVK1bglVdewUsvvYSWLVti3bp1nOOGiIiIANSBeW5qG+e5ISIiqn/qzTw3RERERPbGcENEREQOheGGiIiIHArDDRERETkUhhsiIiJyKAw3RERE5FAUn+emtpWc+c6rgxMREdUfJd/blZnBpsGFm6ysLADg1cGJiIjqoaysLLi7u1fYpsFN4me1WnHlyhW4urpCpVLZ9bUzMzMREhKCixcvcoLAGsT9XDu4n2sH93Pt4b6uHTW1n4UQyMrKQlBQkM2VC8rT4Hpu1Go1GjVqVKPv4ebmxn84tYD7uXZwP9cO7ufaw31dO2piP9+sx6YEBxQTERGRQ2G4ISIiIofCcGNHBoMBs2fPhsFgULoUh8b9XDu4n2sH93Pt4b6uHXVhPze4AcVERETk2NhzQ0RERA6F4YaIiIgcCsMNERERORSGGyIiInIoDDd2snDhQoSGhsJoNCI8PBx79uxRuqQ6LSYmBrfffjtcXV3h5+eHYcOG4cSJEzZt8vPzMWnSJHh7e8PFxQUjRoxAYmKiTZu4uDgMGTIETk5O8PPzw/PPPw+z2WzTZseOHejSpQsMBgNatGiBZcuW1fTHq5PeeustqFQqPPPMM/Iy7mP7uXz5Mv7973/D29sbJpMJHTt2xL59++T1QgjMmjULgYGBMJlMiIyMxKlTp2xeIy0tDWPGjIGbmxs8PDzw2GOPITs726bNX3/9hT59+sBoNCIkJATvvPNOrXy+usBisWDmzJlo2rQpTCYTmjdvjtdee83mWkPcz7fut99+w9ChQxEUFASVSoV169bZrK/Nfbp69Wq0adMGRqMRHTt2xIYNG6r2oQRV28qVK4Verxdffvml+Oeff8QTTzwhPDw8RGJiotKl1VlRUVFi6dKl4u+//xaHDh0SgwcPFo0bNxbZ2dlymwkTJoiQkBARGxsr9u3bJ3r06CF69uwprzebzaJDhw4iMjJSHDx4UGzYsEH4+PiIGTNmyG3Onj0rnJycxNSpU8XRo0fFxx9/LDQajdi0aVOtfl6l7dmzR4SGhorbbrtNPP300/Jy7mP7SEtLE02aNBHjxo0Tf/75pzh79qzYvHmzOH36tNzmrbfeEu7u7mLdunXi8OHD4l//+pdo2rSpyMvLk9sMHDhQdOrUSfzxxx/if//7n2jRooUYPXq0vD4jI0P4+/uLMWPGiL///lt89913wmQyiU8//bRWP69S3njjDeHt7S1++eUXce7cObF69Wrh4uIiPvzwQ7kN9/Ot27Bhg3j55ZfF2rVrBQDxww8/2KyvrX26a9cuodFoxDvvvCOOHj0qXnnlFaHT6cSRI0du+TMx3NhB9+7dxaRJk+TnFotFBAUFiZiYGAWrql+SkpIEAPHf//5XCCFEenq60Ol0YvXq1XKbY8eOCQBi9+7dQgjpH6RarRYJCQlym0WLFgk3NzdRUFAghBDihRdeEO3bt7d5r1GjRomoqKia/kh1RlZWlmjZsqXYunWr6Nu3rxxuuI/t58UXXxS9e/e+4Xqr1SoCAgLEu+++Ky9LT08XBoNBfPfdd0IIIY4ePSoAiL1798ptNm7cKFQqlbh8+bIQQohPPvlEeHp6yvu+5L1bt25t749UJw0ZMkQ8+uijNsvuu+8+MWbMGCEE97M9XB9uanOfjhw5UgwZMsSmnvDwcPGf//znlj8HD0tVU2FhIfbv34/IyEh5mVqtRmRkJHbv3q1gZfVLRkYGAMDLywsAsH//fhQVFdns1zZt2qBx48byft29ezc6duwIf39/uU1UVBQyMzPxzz//yG1Kv0ZJm4b0s5k0aRKGDBlSZj9wH9vPTz/9hG7duuGBBx6An58fwsLC8Nlnn8nrz507h4SEBJv95O7ujvDwcJt97eHhgW7dusltIiMjoVar8eeff8pt7rjjDuj1erlNVFQUTpw4gatXr9b0x1Rcz549ERsbi5MnTwIADh8+jJ07d2LQoEEAuJ9rQm3uU3v+LWG4qaaUlBRYLBabP/4A4O/vj4SEBIWqql+sViueeeYZ9OrVCx06dAAAJCQkQK/Xw8PDw6Zt6f2akJBQ7n4vWVdRm8zMTOTl5dXEx6lTVq5ciQMHDiAmJqbMOu5j+zl79iwWLVqEli1bYvPmzZg4cSKeeuopfPXVVwCu7auK/k4kJCTAz8/PZr1Wq4WXl9ct/Twc2fTp0/Hggw+iTZs20Ol0CAsLwzPPPIMxY8YA4H6uCbW5T2/Upir7vMFdFZzqnkmTJuHvv//Gzp07lS7FoVy8eBFPP/00tm7dCqPRqHQ5Ds1qtaJbt2548803AQBhYWH4+++/sXjxYkRHRytcneP4/vvvsXz5cqxYsQLt27fHoUOH8MwzzyAoKIj7mWyw56aafHx8oNFoypxhkpiYiICAAIWqqj8mT56MX375Bdu3b0ejRo3k5QEBASgsLER6erpN+9L7NSAgoNz9XrKuojZubm4wmUz2/jh1yv79+5GUlIQuXbpAq9VCq9Xiv//9Lz766CNotVr4+/tzH9tJYGAg2rVrZ7Osbdu2iIuLA3BtX1X0dyIgIABJSUk2681mM9LS0m7p5+HInn/+ebn3pmPHjnj44Yfx7LPPyj2T3M/2V5v79EZtqrLPGW6qSa/Xo2vXroiNjZWXWa1WxMbGIiIiQsHK6jYhBCZPnowffvgBv/76K5o2bWqzvmvXrtDpdDb79cSJE4iLi5P3a0REBI4cOWLzj2rr1q1wc3OTv2giIiJsXqOkTUP42dx11104cuQIDh06JN+6deuGMWPGyI+5j+2jV69eZaYyOHnyJJo0aQIAaNq0KQICAmz2U2ZmJv7880+bfZ2eno79+/fLbX799VdYrVaEh4fLbX777TcUFRXJbbZu3YrWrVvD09Ozxj5fXZGbmwu12vZrS6PRwGq1AuB+rgm1uU/t+rfklocgUxkrV64UBoNBLFu2TBw9elSMHz9eeHh42JxhQrYmTpwo3N3dxY4dO0R8fLx8y83NldtMmDBBNG7cWPz6669i3759IiIiQkRERMjrS05Tvvvuu8WhQ4fEpk2bhK+vb7mnKT///PPi2LFjYuHChQ3uNOXSSp8tJQT3sb3s2bNHaLVa8cYbb4hTp06J5cuXCycnJ/Htt9/Kbd566y3h4eEhfvzxR/HXX3+Je++9t9zTacPCwsSff/4pdu7cKVq2bGlzOm16errw9/cXDz/8sPj777/FypUrhZOTk8Oeony96OhoERwcLJ8KvnbtWuHj4yNeeOEFuQ33863LysoSBw8eFAcPHhQAxPvvvy8OHjwoLly4IISovX26a9cuo
dVqxXvvvSeOHTsmZs+ezVPBlfbxxx+Lxo0bC71eL7p37y7++OMPpUuq0wCUe1u6dKncJi8vTzz55JPC09NTODk5ieHDh4v4+Hib1zl//rwYNGiQMJlMwsfHRzz33HOiqKjIps327dtF586dhV6vF82aNbN5j4bm+nDDfWw/P//8s+jQoYMwGAyiTZs2YsmSJTbrrVarmDlzpvD39xcGg0Hcdddd4sSJEzZtUlNTxejRo4WLi4twc3MTjzzyiMjKyrJpc/jwYdG7d29hMBhEcHCweOutt2r8s9UVmZmZ4umnnxaNGzcWRqNRNGvWTLz88ss2pxdzP9+67du3l/v3ODo6WghRu/v0+++/F61atRJ6vV60b99erF+/vkqfSSVEqakdiYiIiOo5jrkhIiIih8JwQ0RERA6F4YaIiIgcCsMNERERORSGGyIiInIoDDdERETkUBhuiIiIyKEw3BBRg6RSqbBu3TqlyyCiGsBwQ0S1bty4cVCpVGVuAwcOVLo0InIAWqULIKKGaeDAgVi6dKnNMoPBoFA1RORI2HNDRIowGAwICAiwuZVcHVilUmHRokUYNGgQTCYTmjVrhjVr1thsf+TIEdx5550wmUzw9vbG+PHjkZ2dbdPmyy+/RPv27WEwGBAYGIjJkyfbrE9JScHw4cPh5OSEli1b4qeffpLXXb16FWPGjIGvry9MJhNatmxZJowRUd3EcENEddLMmTMxYsQIHD58GGPGjMGDDz6IY8eOAQBycnIQFRUFT09P7N27F6tXr8a2bdtswsuiRYswadIkjB8/HkeOHMFPP/2EFi1a2LzH3LlzMXLkSPz1118YPHgwxowZg7S0NPn9jx49io0bN+LYsWNYtGgRfHx8am8HEFHVVelym0RE1RAdHS00Go1wdna2ub3xxhtCCOmq8RMmTLDZJjw8XEycOFEIIcSSJUuEp6enyM7OltevX79eqNVqkZCQIIQQIigoSLz88ss3rAGAeOWVV+Tn2dnZAoDYuHGjEEKIoUOHikceecQ+H5iIahXH3BCRIvr3749FixbZLPPy8pIfR0RE2KyLiIjAoUOHAADHjh1Dp06d4OzsLK/v1asXrFYrTpw4AZVKhStXruCuu+6qsIbbbrtNfuzs7Aw3NzckJSUBACZOnIgRI0bgwIEDuPvuuzFs2DD07NmzSp+ViGoXww0RKcLZ2bnMYSJ7MZlMlWqn0+lsnqtUKlitVgDAoEGDcOHCBWzYsAFbt27FXXfdhUmTJuG9996ze71EZF8cc0NEddIff/xR5nnbtm0BAG3btsXhw4eRk5Mjr9+1axfUajVat24NV1dXhIaGIjY2tlo1+Pr6Ijo6Gt9++y3mz5+PJUuWVOv1iKh2sOeGiBRRUFCAhIQEm2VarVYetLt69Wp069YNvXv3xvLly7Fnzx588cUXAIAxY8Zg9uzZiI6Oxpw5c5CcnIwpU6bg4Ycfhr+/PwBgzpw5mDBhAvz8/DBo0CBkZWVh165dmDJlSqXqmzVrFrp27Yr27dujoKAAv/zyixyuiKhuY7ghIkVs2rQJgYGBNstat26N48ePA5DOZFq5ciWefPJJBAYG4rvvvkO7du0AAE5OTti8eTOefvpp3H777XBycsKIESPw/vvvy68VHR2N/Px8fPDBB5g2bRp8fHxw//33V7o+vV6PGTNm4Pz58zCZTOjTpw9Wrlxph09ORDVNJYQQShdBRFSaSqXCDz/8gGHDhildChHVQxxzQ0RERA6F4YaIiIgcCsfcEFGdw6PlRFQd7LkhIiIih8JwQ0RERA6F4YaIiIgcCsMNERERORSGGyIiInIoDDdERETkUBhuiIiIyKEw3BAREZFDYbghIiIih/L/y/cG7V6ONWEAAAAASUVORK5CYII=", "text/plain": [ "
" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "model = ANN()\n", "\n", "# Define loss function and optimizer\n", "criterion = nn.CrossEntropyLoss() # NOTE: We select a classification loss\n", "optimizer = optim.Adam(model.parameters(), lr=0.001)\n", "\n", "# Training the model\n", "epochs = 10000\n", "train_losses = []\n", "test_losses = []\n", "for epoch in range(epochs):\n", " optimizer.zero_grad()\n", " outputs = model(X_train)\n", " loss = criterion(outputs, y_train)\n", " loss.backward()\n", " optimizer.step()\n", " train_losses.append(loss.item())\n", "\n", " # Evaluation step on testing set\n", " with torch.no_grad():\n", " test_outputs = model(X_test)\n", " test_loss = criterion(test_outputs, y_test)\n", " test_losses.append(test_loss.item())\n", "\n", " print(f'Epoch {epoch+1}/{epochs}, Training Loss: {loss.item()}, Test Loss: {test_loss.item()}')\n", "\n", "# Plotting the training and testing losses over epochs\n", "plt.plot(range(epochs), train_losses, label='Training Loss')\n", "plt.plot(range(epochs), test_losses, label='Testing Loss')\n", "plt.xlabel('Epochs')\n", "plt.ylabel('Loss')\n", "plt.title('Training and Testing Loss Over Epochs')\n", "plt.legend()\n", "plt.show()\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Evaluation\n", "\n", "From the run above, it seems like we should stop training after around 3,000 epochs. Let's re-run the training, stopping at 3,000 epochs, after which we will discuss further how this model can be evaluated." ] }, { "cell_type": "code", "execution_count": 3, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Epoch 1/3000, Training Loss: 1.2640759944915771, Test Loss: 1.27169930934906\n", "Epoch 2/3000, Training Loss: 1.2523120641708374, Test Loss: 1.2588136196136475\n", "Epoch 3/3000, Training Loss: 1.2410619258880615, Test Loss: 1.246541976928711\n", "Epoch 4/3000, Training Loss: 1.2302714586257935, Test Loss: 1.234955906867981\n", "Epoch 5/3000, Training Loss: 1.2199515104293823, Test Loss: 1.224029541015625\n", "Epoch 6/3000, Training Loss: 1.210060477256775, Test Loss: 1.2137163877487183\n", "Epoch 7/3000, Training Loss: 1.2006133794784546, Test Loss: 1.2039904594421387\n", "Epoch 8/3000, Training Loss: 1.1915860176086426, Test Loss: 1.1948440074920654\n", "Epoch 9/3000, Training Loss: 1.1829493045806885, Test Loss: 1.1862380504608154\n", "Epoch 10/3000, Training Loss: 1.1747429370880127, Test Loss: 1.178145408630371\n", "Epoch 11/3000, Training Loss: 1.1669117212295532, Test Loss: 1.1705280542373657\n", "Epoch 12/3000, Training Loss: 1.1594367027282715, Test Loss: 1.163350224494934\n", "Epoch 13/3000, Training Loss: 1.1523357629776, Test Loss: 1.1565824747085571\n", "Epoch 14/3000, Training Loss: 1.145584225654602, Test Loss: 1.1501802206039429\n", "Epoch 15/3000, Training Loss: 1.1391065120697021, Test Loss: 1.1441161632537842\n", "Epoch 16/3000, Training Loss: 1.1328951120376587, Test Loss: 1.1383739709854126\n", "Epoch 17/3000, Training Loss: 1.1269614696502686, Test Loss: 1.1329452991485596\n", "Epoch 18/3000, Training Loss: 1.121305227279663, Test Loss: 1.1278512477874756\n", "Epoch 19/3000, Training Loss: 1.1158807277679443, Test Loss: 1.1230573654174805\n", "Epoch 20/3000, Training Loss: 1.1106503009796143, Test Loss: 1.1185215711593628\n", "Epoch 21/3000, Training Loss: 1.1056286096572876, Test Loss: 1.114198923110962\n", "Epoch 22/3000, Training Loss: 1.100834846496582, Test Loss: 1.1101927757263184\n", "Epoch 23/3000, Training Loss: 
1.0962785482406616, Test Loss: 1.1065083742141724\n", "Epoch 24/3000, Training Loss: 1.0920391082763672, Test Loss: 1.1030521392822266\n", "Epoch 25/3000, Training Loss: 1.0880424976348877, Test Loss: 1.099786400794983\n", "Epoch 26/3000, Training Loss: 1.0842459201812744, Test Loss: 1.096688151359558\n", "Epoch 27/3000, Training Loss: 1.0805562734603882, Test Loss: 1.0937355756759644\n", "Epoch 28/3000, Training Loss: 1.076967716217041, Test Loss: 1.0908997058868408\n", "Epoch 29/3000, Training Loss: 1.0734919309616089, Test Loss: 1.0881552696228027\n", "Epoch 30/3000, Training Loss: 1.0701206922531128, Test Loss: 1.085484266281128\n", "Epoch 31/3000, Training Loss: 1.0668364763259888, Test Loss: 1.082849383354187\n", "Epoch 32/3000, Training Loss: 1.0636320114135742, Test Loss: 1.080229640007019\n", "Epoch 33/3000, Training Loss: 1.0604945421218872, Test Loss: 1.0775994062423706\n", "Epoch 34/3000, Training Loss: 1.057410717010498, Test Loss: 1.0749439001083374\n", "Epoch 35/3000, Training Loss: 1.0543670654296875, Test Loss: 1.0722380876541138\n", "Epoch 36/3000, Training Loss: 1.0513490438461304, Test Loss: 1.0694663524627686\n", "Epoch 37/3000, Training Loss: 1.0483425855636597, Test Loss: 1.0666128396987915\n", "Epoch 38/3000, Training Loss: 1.0453417301177979, Test Loss: 1.063666820526123\n", "Epoch 39/3000, Training Loss: 1.0423424243927002, Test Loss: 1.0606235265731812\n", "Epoch 40/3000, Training Loss: 1.039341688156128, Test Loss: 1.0574989318847656\n", "Epoch 41/3000, Training Loss: 1.036332607269287, Test Loss: 1.0542974472045898\n", "Epoch 42/3000, Training Loss: 1.033300757408142, Test Loss: 1.051009178161621\n", "Epoch 43/3000, Training Loss: 1.0302507877349854, Test Loss: 1.047642469406128\n", "Epoch 44/3000, Training Loss: 1.0271908044815063, Test Loss: 1.044197678565979\n", "Epoch 45/3000, Training Loss: 1.0241268873214722, Test Loss: 1.0406802892684937\n", "Epoch 46/3000, Training Loss: 1.0210646390914917, Test Loss: 1.0371042490005493\n", "Epoch 47/3000, Training Loss: 1.0180052518844604, Test Loss: 1.0334787368774414\n", "Epoch 48/3000, Training Loss: 1.0149351358413696, Test Loss: 1.0298172235488892\n", "Epoch 49/3000, Training Loss: 1.011849045753479, Test Loss: 1.0261273384094238\n", "Epoch 50/3000, Training Loss: 1.0087536573410034, Test Loss: 1.0223971605300903\n", "Epoch 51/3000, Training Loss: 1.0056618452072144, Test Loss: 1.018624186515808\n", "Epoch 52/3000, Training Loss: 1.0025454759597778, Test Loss: 1.0148059129714966\n", "Epoch 53/3000, Training Loss: 0.9994217157363892, Test Loss: 1.0109554529190063\n", "Epoch 54/3000, Training Loss: 0.9962747097015381, Test Loss: 1.0070828199386597\n", "Epoch 55/3000, Training Loss: 0.9931063055992126, Test Loss: 1.003200888633728\n", "Epoch 56/3000, Training Loss: 0.9899594187736511, Test Loss: 0.9993086457252502\n", "Epoch 57/3000, Training Loss: 0.9868112206459045, Test Loss: 0.9954110980033875\n", "Epoch 58/3000, Training Loss: 0.983668327331543, Test Loss: 0.9915145039558411\n", "Epoch 59/3000, Training Loss: 0.9805490970611572, Test Loss: 0.9876216650009155\n", "Epoch 60/3000, Training Loss: 0.9774302244186401, Test Loss: 0.9837351441383362\n", "Epoch 61/3000, Training Loss: 0.9743132591247559, Test Loss: 0.9798592329025269\n", "Epoch 62/3000, Training Loss: 0.9712060689926147, Test Loss: 0.975984513759613\n", "Epoch 63/3000, Training Loss: 0.9680915474891663, Test Loss: 0.9721161127090454\n", "Epoch 64/3000, Training Loss: 0.9649748802185059, Test Loss: 0.968251645565033\n", "Epoch 65/3000, Training Loss: 
0.9618940353393555, Test Loss: 0.9644227027893066\n", "Epoch 66/3000, Training Loss: 0.9588298797607422, Test Loss: 0.9605921506881714\n", "Epoch 67/3000, Training Loss: 0.9557666778564453, Test Loss: 0.9567680954933167\n", "Epoch 68/3000, Training Loss: 0.9526993036270142, Test Loss: 0.9529523849487305\n", "Epoch 69/3000, Training Loss: 0.9496186375617981, Test Loss: 0.9491115212440491\n", "Epoch 70/3000, Training Loss: 0.9464993476867676, Test Loss: 0.9452773928642273\n", "Epoch 71/3000, Training Loss: 0.9433677196502686, Test Loss: 0.9414547681808472\n", "Epoch 72/3000, Training Loss: 0.9402310252189636, Test Loss: 0.9376351833343506\n", "Epoch 73/3000, Training Loss: 0.9370895624160767, Test Loss: 0.9338263869285583\n", "Epoch 74/3000, Training Loss: 0.9339436888694763, Test Loss: 0.9300621747970581\n", "Epoch 75/3000, Training Loss: 0.9307897090911865, Test Loss: 0.9263094067573547\n", "Epoch 76/3000, Training Loss: 0.9276247024536133, Test Loss: 0.9225667119026184\n", "Epoch 77/3000, Training Loss: 0.9244548678398132, Test Loss: 0.9188277125358582\n", "Epoch 78/3000, Training Loss: 0.9212808012962341, Test Loss: 0.9151037335395813\n", "Epoch 79/3000, Training Loss: 0.9180876612663269, Test Loss: 0.9114034175872803\n", "Epoch 80/3000, Training Loss: 0.9148722290992737, Test Loss: 0.9077056050300598\n", "Epoch 81/3000, Training Loss: 0.9116887450218201, Test Loss: 0.9040129780769348\n", "Epoch 82/3000, Training Loss: 0.9085201025009155, Test Loss: 0.9003388285636902\n", "Epoch 83/3000, Training Loss: 0.9053446650505066, Test Loss: 0.8966683149337769\n", "Epoch 84/3000, Training Loss: 0.9021593332290649, Test Loss: 0.893000602722168\n", "Epoch 85/3000, Training Loss: 0.8989878296852112, Test Loss: 0.8893291354179382\n", "Epoch 86/3000, Training Loss: 0.8958309888839722, Test Loss: 0.8856792449951172\n", "Epoch 87/3000, Training Loss: 0.8926759958267212, Test Loss: 0.8820249438285828\n", "Epoch 88/3000, Training Loss: 0.8895533084869385, Test Loss: 0.8783535957336426\n", "Epoch 89/3000, Training Loss: 0.8864481449127197, Test Loss: 0.8746686577796936\n", "Epoch 90/3000, Training Loss: 0.8833396434783936, Test Loss: 0.870990514755249\n", "Epoch 91/3000, Training Loss: 0.8802304863929749, Test Loss: 0.8673170208930969\n", "Epoch 92/3000, Training Loss: 0.8771556615829468, Test Loss: 0.8636322021484375\n", "Epoch 93/3000, Training Loss: 0.8740807175636292, Test Loss: 0.85992431640625\n", "Epoch 94/3000, Training Loss: 0.870998203754425, Test Loss: 0.8561899065971375\n", "Epoch 95/3000, Training Loss: 0.8679099678993225, Test Loss: 0.8524385094642639\n", "Epoch 96/3000, Training Loss: 0.8648164868354797, Test Loss: 0.8486745357513428\n", "Epoch 97/3000, Training Loss: 0.8617183566093445, Test Loss: 0.8449020981788635\n", "Epoch 98/3000, Training Loss: 0.8586162328720093, Test Loss: 0.841124415397644\n", "Epoch 99/3000, Training Loss: 0.8555155396461487, Test Loss: 0.8373358845710754\n", "Epoch 100/3000, Training Loss: 0.852409839630127, Test Loss: 0.8335404396057129\n", "Epoch 101/3000, Training Loss: 0.8493000864982605, Test Loss: 0.8297463655471802\n", "Epoch 102/3000, Training Loss: 0.8461863398551941, Test Loss: 0.8259825706481934\n", "Epoch 103/3000, Training Loss: 0.843070387840271, Test Loss: 0.8222308158874512\n", "Epoch 104/3000, Training Loss: 0.839952826499939, Test Loss: 0.8184829354286194\n", "Epoch 105/3000, Training Loss: 0.8368342518806458, Test Loss: 0.8147412538528442\n", "Epoch 106/3000, Training Loss: 0.8337228894233704, Test Loss: 0.8110154271125793\n", "Epoch 107/3000, 
Training Loss: 0.8306151032447815, Test Loss: 0.8073061108589172\n",
"...\n",
"Epoch 200/3000, Training Loss: 0.590614914894104, Test Loss: 0.5351982712745667\n",
"Epoch 300/3000, Training Loss: 0.4466243386268616, Test Loss: 0.38614246249198914\n",
"Epoch 400/3000, Training Loss: 0.3486766517162323, Test Loss: 0.2919667065143585\n",
"Epoch 500/3000, Training Loss: 0.27059680223464966, Test Loss: 0.2202071100473404\n",
"Epoch 600/3000, Training Loss: 0.21148058772087097, Test Loss: 0.16773396730422974\n",
"Epoch 700/3000, Training Loss: 0.16979673504829407, Test Loss: 0.132192462682724\n",
"Epoch 800/3000, Training Loss: 0.14106857776641846, Test Loss: 0.10894577950239182\n",
"Epoch 900/3000, Training Loss: 0.12106698006391525, Test Loss: 0.09369063377380371\n",
"...\n",
"Epoch 963/3000, Training Loss: 0.1115444079041481, Test Loss: 
0.08682741224765778\n", "Epoch 964/3000, Training Loss: 0.11140834540128708, Test Loss: 0.08673184365034103\n", "Epoch 965/3000, Training Loss: 0.11127270013093948, Test Loss: 0.08663669228553772\n", "Epoch 966/3000, Training Loss: 0.1111375018954277, Test Loss: 0.08654193580150604\n", "Epoch 967/3000, Training Loss: 0.11100275814533234, Test Loss: 0.08644747734069824\n", "Epoch 968/3000, Training Loss: 0.11086840182542801, Test Loss: 0.08635345846414566\n", "Epoch 969/3000, Training Loss: 0.1107344850897789, Test Loss: 0.08625979721546173\n", "Epoch 970/3000, Training Loss: 0.11060097068548203, Test Loss: 0.08616649359464645\n", "Epoch 971/3000, Training Loss: 0.11046789586544037, Test Loss: 0.08607356250286102\n", "Epoch 972/3000, Training Loss: 0.11033522337675095, Test Loss: 0.08598099648952484\n", "Epoch 973/3000, Training Loss: 0.11020300537347794, Test Loss: 0.0858888030052185\n", "Epoch 974/3000, Training Loss: 0.11007115989923477, Test Loss: 0.08579692989587784\n", "Epoch 975/3000, Training Loss: 0.10993973910808563, Test Loss: 0.08570542931556702\n", "Epoch 976/3000, Training Loss: 0.10980875790119171, Test Loss: 0.08561433106660843\n", "Epoch 977/3000, Training Loss: 0.10967812687158585, Test Loss: 0.08552353829145432\n", "Epoch 978/3000, Training Loss: 0.1095479354262352, Test Loss: 0.08543315529823303\n", "Epoch 979/3000, Training Loss: 0.10941815376281738, Test Loss: 0.0853431224822998\n", "Epoch 980/3000, Training Loss: 0.109288789331913, Test Loss: 0.08525339514017105\n", "Epoch 981/3000, Training Loss: 0.10915978997945786, Test Loss: 0.08516405522823334\n", "Epoch 982/3000, Training Loss: 0.10903124511241913, Test Loss: 0.08507503569126129\n", "Epoch 983/3000, Training Loss: 0.10890307277441025, Test Loss: 0.0849863588809967\n", "Epoch 984/3000, Training Loss: 0.10877528041601181, Test Loss: 0.08489806205034256\n", "Epoch 985/3000, Training Loss: 0.1086479052901268, Test Loss: 0.0848100557923317\n", "Epoch 986/3000, Training Loss: 0.10852092504501343, Test Loss: 0.08472240716218948\n", "Epoch 987/3000, Training Loss: 0.1083943173289299, Test Loss: 0.08463513106107712\n", "Epoch 988/3000, Training Loss: 0.10826814919710159, Test Loss: 0.08454817533493042\n", "Epoch 989/3000, Training Loss: 0.10814233124256134, Test Loss: 0.08446158468723297\n", "Epoch 990/3000, Training Loss: 0.10801691561937332, Test Loss: 0.08437535166740417\n", "Epoch 991/3000, Training Loss: 0.10789189487695694, Test Loss: 0.08428940176963806\n", "Epoch 992/3000, Training Loss: 0.1077672466635704, Test Loss: 0.08420386910438538\n", "Epoch 993/3000, Training Loss: 0.10764296352863312, Test Loss: 0.08411863446235657\n", "Epoch 994/3000, Training Loss: 0.10751909762620926, Test Loss: 0.08403368294239044\n", "Epoch 995/3000, Training Loss: 0.10739558935165405, Test Loss: 0.08394906669855118\n", "Epoch 996/3000, Training Loss: 0.10727250576019287, Test Loss: 0.08386367559432983\n", "Epoch 997/3000, Training Loss: 0.10714980959892273, Test Loss: 0.08377785980701447\n", "Epoch 998/3000, Training Loss: 0.10702747106552124, Test Loss: 0.08369193971157074\n", "Epoch 999/3000, Training Loss: 0.10690552741289139, Test Loss: 0.08360634744167328\n", "Epoch 1000/3000, Training Loss: 0.10678394138813019, Test Loss: 0.08352144062519073\n", "Epoch 1001/3000, Training Loss: 0.10666275024414062, Test Loss: 0.08343751728534698\n", "Epoch 1002/3000, Training Loss: 0.10654190927743912, Test Loss: 0.08335473388433456\n", "Epoch 1003/3000, Training Loss: 0.10642144083976746, Test Loss: 0.08327310532331467\n", "Epoch 1004/3000, 
Training Loss: 0.10630135238170624, Test Loss: 0.08319252729415894\n", "Epoch 1005/3000, Training Loss: 0.10618162900209427, Test Loss: 0.08311271667480469\n", "Epoch 1006/3000, Training Loss: 0.10606225579977036, Test Loss: 0.08303337544202805\n", "Epoch 1007/3000, Training Loss: 0.10594325512647629, Test Loss: 0.08295425772666931\n", "Epoch 1008/3000, Training Loss: 0.10582461953163147, Test Loss: 0.0828750729560852\n", "Epoch 1009/3000, Training Loss: 0.1057063490152359, Test Loss: 0.08279567211866379\n", "Epoch 1010/3000, Training Loss: 0.10558842122554779, Test Loss: 0.08271601796150208\n", "Epoch 1011/3000, Training Loss: 0.10547086596488953, Test Loss: 0.08263615518808365\n", "Epoch 1012/3000, Training Loss: 0.10535366088151932, Test Loss: 0.08255613595247269\n", "Epoch 1013/3000, Training Loss: 0.10523682087659836, Test Loss: 0.08247622847557068\n", "Epoch 1014/3000, Training Loss: 0.10512033849954605, Test Loss: 0.08239655196666718\n", "Epoch 1015/3000, Training Loss: 0.10500416904687881, Test Loss: 0.08231737464666367\n", "Epoch 1016/3000, Training Loss: 0.10488837212324142, Test Loss: 0.08223874121904373\n", "Epoch 1017/3000, Training Loss: 0.10477294772863388, Test Loss: 0.08216079324483871\n", "Epoch 1018/3000, Training Loss: 0.104657843708992, Test Loss: 0.08208349347114563\n", "Epoch 1019/3000, Training Loss: 0.10454309731721878, Test Loss: 0.08200681954622269\n", "Epoch 1020/3000, Training Loss: 0.10442867130041122, Test Loss: 0.08193069696426392\n", "Epoch 1021/3000, Training Loss: 0.10431463271379471, Test Loss: 0.0818549245595932\n", "Epoch 1022/3000, Training Loss: 0.10420089960098267, Test Loss: 0.08177942037582397\n", "Epoch 1023/3000, Training Loss: 0.10408751666545868, Test Loss: 0.0817040354013443\n", "Epoch 1024/3000, Training Loss: 0.10397449135780334, Test Loss: 0.08162868767976761\n", "Epoch 1025/3000, Training Loss: 0.10386178642511368, Test Loss: 0.0815533697605133\n", "Epoch 1026/3000, Training Loss: 0.10374942421913147, Test Loss: 0.08147813379764557\n", "Epoch 1027/3000, Training Loss: 0.10363738238811493, Test Loss: 0.08140291273593903\n", "Epoch 1028/3000, Training Loss: 0.10352570563554764, Test Loss: 0.0813278928399086\n", "Epoch 1029/3000, Training Loss: 0.10341435670852661, Test Loss: 0.08125309646129608\n", "Epoch 1030/3000, Training Loss: 0.10330329090356827, Test Loss: 0.0811786875128746\n", "Epoch 1031/3000, Training Loss: 0.10319261997938156, Test Loss: 0.08110468834638596\n", "Epoch 1032/3000, Training Loss: 0.10308224707841873, Test Loss: 0.08103112131357193\n", "Epoch 1033/3000, Training Loss: 0.10297221690416336, Test Loss: 0.0809580385684967\n", "Epoch 1034/3000, Training Loss: 0.10286246985197067, Test Loss: 0.08088534325361252\n", "Epoch 1035/3000, Training Loss: 0.10275308042764664, Test Loss: 0.08081308752298355\n", "Epoch 1036/3000, Training Loss: 0.10264403373003006, Test Loss: 0.08074115216732025\n", "Epoch 1037/3000, Training Loss: 0.10253529995679855, Test Loss: 0.08066942542791367\n", "Epoch 1038/3000, Training Loss: 0.10242687165737152, Test Loss: 0.08059792220592499\n", "Epoch 1039/3000, Training Loss: 0.10231875628232956, Test Loss: 0.08052659779787064\n", "Epoch 1040/3000, Training Loss: 0.10221096873283386, Test Loss: 0.08045543730258942\n", "Epoch 1041/3000, Training Loss: 0.10210348665714264, Test Loss: 0.08038439601659775\n", "Epoch 1042/3000, Training Loss: 0.10199635475873947, Test Loss: 0.0803135335445404\n", "Epoch 1043/3000, Training Loss: 0.10188951343297958, Test Loss: 0.08024291694164276\n", "Epoch 1044/3000, 
Training Loss: 0.10178299248218536, Test Loss: 0.08017252385616302\n", "Epoch 1045/3000, Training Loss: 0.10167678445577621, Test Loss: 0.08010242134332657\n", "Epoch 1046/3000, Training Loss: 0.10157088935375214, Test Loss: 0.08003269135951996\n", "Epoch 1047/3000, Training Loss: 0.10146529972553253, Test Loss: 0.07996327430009842\n", "Epoch 1048/3000, Training Loss: 0.101360023021698, Test Loss: 0.07989421486854553\n", "Epoch 1049/3000, Training Loss: 0.10125505179166794, Test Loss: 0.0798254907131195\n", "Epoch 1050/3000, Training Loss: 0.10115040093660355, Test Loss: 0.07975707203149796\n", "Epoch 1051/3000, Training Loss: 0.10104605555534363, Test Loss: 0.0796889141201973\n", "Epoch 1052/3000, Training Loss: 0.1009419858455658, Test Loss: 0.07962105423212051\n", "Epoch 1053/3000, Training Loss: 0.10083822906017303, Test Loss: 0.07955339550971985\n", "Epoch 1054/3000, Training Loss: 0.10073477774858475, Test Loss: 0.0794859454035759\n", "Epoch 1055/3000, Training Loss: 0.10063161700963974, Test Loss: 0.07941870391368866\n", "Epoch 1056/3000, Training Loss: 0.100528784096241, Test Loss: 0.07935167104005814\n", "Epoch 1057/3000, Training Loss: 0.10042624175548553, Test Loss: 0.07928481698036194\n", "Epoch 1058/3000, Training Loss: 0.10032398253679276, Test Loss: 0.07921818643808365\n", "Epoch 1059/3000, Training Loss: 0.10022202134132385, Test Loss: 0.07915183156728745\n", "Epoch 1060/3000, Training Loss: 0.10012038052082062, Test Loss: 0.07908575981855392\n", "Epoch 1061/3000, Training Loss: 0.10001898556947708, Test Loss: 0.07901996374130249\n", "Epoch 1062/3000, Training Loss: 0.0999179258942604, Test Loss: 0.07895444333553314\n", "Epoch 1063/3000, Training Loss: 0.09981712698936462, Test Loss: 0.07888926565647125\n", "Epoch 1064/3000, Training Loss: 0.09971662610769272, Test Loss: 0.07882428169250488\n", "Epoch 1065/3000, Training Loss: 0.09961642324924469, Test Loss: 0.0787595883011818\n", "Epoch 1066/3000, Training Loss: 0.09951650351285934, Test Loss: 0.0786951556801796\n", "Epoch 1067/3000, Training Loss: 0.09941689670085907, Test Loss: 0.0786309465765953\n", "Epoch 1068/3000, Training Loss: 0.0993175357580185, Test Loss: 0.07856696844100952\n", "Epoch 1069/3000, Training Loss: 0.09921848773956299, Test Loss: 0.07850324362516403\n", "Epoch 1070/3000, Training Loss: 0.09911972284317017, Test Loss: 0.07843964546918869\n", "Epoch 1071/3000, Training Loss: 0.09902121126651764, Test Loss: 0.07837632298469543\n", "Epoch 1072/3000, Training Loss: 0.09892305731773376, Test Loss: 0.07831625640392303\n", "Epoch 1073/3000, Training Loss: 0.09882517904043198, Test Loss: 0.07825861871242523\n", "Epoch 1074/3000, Training Loss: 0.09872759133577347, Test Loss: 0.07820238173007965\n", "Epoch 1075/3000, Training Loss: 0.09863027930259705, Test Loss: 0.0781463161110878\n", "Epoch 1076/3000, Training Loss: 0.09853322058916092, Test Loss: 0.07808946073055267\n", "Epoch 1077/3000, Training Loss: 0.09843648225069046, Test Loss: 0.07803088426589966\n", "Epoch 1078/3000, Training Loss: 0.0983399748802185, Test Loss: 0.07797035574913025\n", "Epoch 1079/3000, Training Loss: 0.09824377298355103, Test Loss: 0.07790790498256683\n", "Epoch 1080/3000, Training Loss: 0.09814780205488205, Test Loss: 0.07784388959407806\n", "Epoch 1081/3000, Training Loss: 0.09805209934711456, Test Loss: 0.07777906209230423\n", "Epoch 1082/3000, Training Loss: 0.09795667976140976, Test Loss: 0.07771419733762741\n", "Epoch 1083/3000, Training Loss: 0.09786155819892883, Test Loss: 0.07765006273984909\n", "Epoch 1084/3000, Training 
Loss: 0.09776665270328522, Test Loss: 0.07758726924657822\n", "Epoch 1085/3000, Training Loss: 0.09767205268144608, Test Loss: 0.07752611488103867\n", "Epoch 1086/3000, Training Loss: 0.09757772088050842, Test Loss: 0.07746681571006775\n", "Epoch 1087/3000, Training Loss: 0.09748363494873047, Test Loss: 0.07740902900695801\n", "Epoch 1088/3000, Training Loss: 0.09738978743553162, Test Loss: 0.07735241204500198\n", "Epoch 1089/3000, Training Loss: 0.09729626029729843, Test Loss: 0.07729648053646088\n", "Epoch 1090/3000, Training Loss: 0.09720298647880554, Test Loss: 0.07724061608314514\n", "Epoch 1091/3000, Training Loss: 0.09710993617773056, Test Loss: 0.07718438655138016\n", "Epoch 1092/3000, Training Loss: 0.09701716899871826, Test Loss: 0.07712455093860626\n", "Epoch 1093/3000, Training Loss: 0.09692467004060745, Test Loss: 0.0770617201924324\n", "Epoch 1094/3000, Training Loss: 0.09683245420455933, Test Loss: 0.07699699699878693\n", "Epoch 1095/3000, Training Loss: 0.0967404767870903, Test Loss: 0.07693172991275787\n", "Epoch 1096/3000, Training Loss: 0.09664874523878098, Test Loss: 0.07686734199523926\n", "Epoch 1097/3000, Training Loss: 0.09655728191137314, Test Loss: 0.07680493593215942\n", "Epoch 1098/3000, Training Loss: 0.0964660793542862, Test Loss: 0.07674804329872131\n", "Epoch 1099/3000, Training Loss: 0.09637513756752014, Test Loss: 0.07669610530138016\n", "Epoch 1100/3000, Training Loss: 0.0962844043970108, Test Loss: 0.0766477957367897\n", "Epoch 1101/3000, Training Loss: 0.09619394689798355, Test Loss: 0.07660133391618729\n", "Epoch 1102/3000, Training Loss: 0.09610375761985779, Test Loss: 0.07655484229326248\n", "Epoch 1103/3000, Training Loss: 0.09601379930973053, Test Loss: 0.07650382816791534\n", "Epoch 1104/3000, Training Loss: 0.09592410922050476, Test Loss: 0.07644783705472946\n", "Epoch 1105/3000, Training Loss: 0.0958346351981163, Test Loss: 0.07638741284608841\n", "Epoch 1106/3000, Training Loss: 0.09574544429779053, Test Loss: 0.07632669061422348\n", "Epoch 1107/3000, Training Loss: 0.09565646946430206, Test Loss: 0.07626654207706451\n", "Epoch 1108/3000, Training Loss: 0.09556776285171509, Test Loss: 0.07620783895254135\n", "Epoch 1109/3000, Training Loss: 0.09547930210828781, Test Loss: 0.07615106552839279\n", "Epoch 1110/3000, Training Loss: 0.09539107978343964, Test Loss: 0.07609642297029495\n", "Epoch 1111/3000, Training Loss: 0.09530307352542877, Test Loss: 0.07604364305734634\n", "Epoch 1112/3000, Training Loss: 0.0952153429389, Test Loss: 0.07599237561225891\n", "Epoch 1113/3000, Training Loss: 0.09512782841920853, Test Loss: 0.07594189792871475\n", "Epoch 1114/3000, Training Loss: 0.09504058212041855, Test Loss: 0.07588881999254227\n", "Epoch 1115/3000, Training Loss: 0.09495353698730469, Test Loss: 0.07583324611186981\n", "Epoch 1116/3000, Training Loss: 0.09486677497625351, Test Loss: 0.07577579468488693\n", "Epoch 1117/3000, Training Loss: 0.09478020668029785, Test Loss: 0.07571732997894287\n", "Epoch 1118/3000, Training Loss: 0.09469391405582428, Test Loss: 0.07566172629594803\n", "Epoch 1119/3000, Training Loss: 0.09460784494876862, Test Loss: 0.07560907304286957\n", "Epoch 1120/3000, Training Loss: 0.09452200680971146, Test Loss: 0.0755591094493866\n", "Epoch 1121/3000, Training Loss: 0.0944363921880722, Test Loss: 0.07551111280918121\n", "Epoch 1122/3000, Training Loss: 0.09435103088617325, Test Loss: 0.0754641517996788\n", "Epoch 1123/3000, Training Loss: 0.0942658856511116, Test Loss: 0.07541722059249878\n", "Epoch 1124/3000, Training Loss: 
0.09418097883462906, Test Loss: 0.07536941021680832\n", "Epoch 1125/3000, Training Loss: 0.09409632533788681, Test Loss: 0.07531740516424179\n", "Epoch 1126/3000, Training Loss: 0.0940118357539177, Test Loss: 0.0752616673707962\n", "Epoch 1127/3000, Training Loss: 0.09392763674259186, Test Loss: 0.07520608603954315\n", "Epoch 1128/3000, Training Loss: 0.09384364634752274, Test Loss: 0.07515140622854233\n", "Epoch 1129/3000, Training Loss: 0.09375987201929092, Test Loss: 0.07509812712669373\n", "Epoch 1130/3000, Training Loss: 0.09367635101079941, Test Loss: 0.07504662871360779\n", "Epoch 1131/3000, Training Loss: 0.0935930386185646, Test Loss: 0.07499682903289795\n", "Epoch 1132/3000, Training Loss: 0.09350995719432831, Test Loss: 0.07494572550058365\n", "Epoch 1133/3000, Training Loss: 0.09342709928750992, Test Loss: 0.07489362359046936\n", "Epoch 1134/3000, Training Loss: 0.09334443509578705, Test Loss: 0.07484368234872818\n", "Epoch 1135/3000, Training Loss: 0.09326203912496567, Test Loss: 0.07479570060968399\n", "Epoch 1136/3000, Training Loss: 0.09317980706691742, Test Loss: 0.07474915683269501\n", "Epoch 1137/3000, Training Loss: 0.09309783577919006, Test Loss: 0.07470066845417023\n", "Epoch 1138/3000, Training Loss: 0.09301608055830002, Test Loss: 0.07465038448572159\n", "Epoch 1139/3000, Training Loss: 0.09293454885482788, Test Loss: 0.0746011957526207\n", "Epoch 1140/3000, Training Loss: 0.09285323321819305, Test Loss: 0.07455313205718994\n", "Epoch 1141/3000, Training Loss: 0.09277214854955673, Test Loss: 0.07450592517852783\n", "Epoch 1142/3000, Training Loss: 0.09269124269485474, Test Loss: 0.07445921748876572\n", "Epoch 1143/3000, Training Loss: 0.09261056035757065, Test Loss: 0.07441264390945435\n", "Epoch 1144/3000, Training Loss: 0.09253010898828506, Test Loss: 0.07436592131853104\n", "Epoch 1145/3000, Training Loss: 0.09244988113641739, Test Loss: 0.07431882619857788\n", "Epoch 1146/3000, Training Loss: 0.09236983954906464, Test Loss: 0.07426857203245163\n", "Epoch 1147/3000, Training Loss: 0.09229002147912979, Test Loss: 0.07421594113111496\n", "Epoch 1148/3000, Training Loss: 0.09221041202545166, Test Loss: 0.07416464388370514\n", "Epoch 1149/3000, Training Loss: 0.09213102608919144, Test Loss: 0.07411529868841171\n", "Epoch 1150/3000, Training Loss: 0.09205184131860733, Test Loss: 0.07406804710626602\n", "Epoch 1151/3000, Training Loss: 0.09197285026311874, Test Loss: 0.07402268797159195\n", "Epoch 1152/3000, Training Loss: 0.09189411252737045, Test Loss: 0.07397878915071487\n", "Epoch 1153/3000, Training Loss: 0.09181555360555649, Test Loss: 0.07393310219049454\n", "Epoch 1154/3000, Training Loss: 0.09173720329999924, Test Loss: 0.07388568669557571\n", "Epoch 1155/3000, Training Loss: 0.09165903925895691, Test Loss: 0.0738394483923912\n", "Epoch 1156/3000, Training Loss: 0.09158109873533249, Test Loss: 0.07379425317049026\n", "Epoch 1157/3000, Training Loss: 0.09150337427854538, Test Loss: 0.07374989241361618\n", "Epoch 1158/3000, Training Loss: 0.0914258286356926, Test Loss: 0.07370594888925552\n", "Epoch 1159/3000, Training Loss: 0.09134852886199951, Test Loss: 0.07366213202476501\n", "Epoch 1160/3000, Training Loss: 0.09127141535282135, Test Loss: 0.073618084192276\n", "Epoch 1161/3000, Training Loss: 0.09119448065757751, Test Loss: 0.07357105612754822\n", "Epoch 1162/3000, Training Loss: 0.0911177471280098, Test Loss: 0.07352162897586823\n", "Epoch 1163/3000, Training Loss: 0.09104122221469879, Test Loss: 0.07347337156534195\n", "Epoch 1164/3000, Training Loss: 
0.0909649208188057, Test Loss: 0.07342676818370819\n", "Epoch 1165/3000, Training Loss: 0.09088882803916931, Test Loss: 0.07338201254606247\n", "Epoch 1166/3000, Training Loss: 0.09081289917230606, Test Loss: 0.07333885878324509\n", "Epoch 1167/3000, Training Loss: 0.09073715656995773, Test Loss: 0.07329701632261276\n", "Epoch 1168/3000, Training Loss: 0.09066161513328552, Test Loss: 0.07325583696365356\n", "Epoch 1169/3000, Training Loss: 0.09058629721403122, Test Loss: 0.0732148289680481\n", "Epoch 1170/3000, Training Loss: 0.09051115065813065, Test Loss: 0.07317092269659042\n", "Epoch 1171/3000, Training Loss: 0.09043622016906738, Test Loss: 0.07312685996294022\n", "Epoch 1172/3000, Training Loss: 0.09036145359277725, Test Loss: 0.0730828121304512\n", "Epoch 1173/3000, Training Loss: 0.09028692543506622, Test Loss: 0.07303635776042938\n", "Epoch 1174/3000, Training Loss: 0.09021256119012833, Test Loss: 0.0729907676577568\n", "Epoch 1175/3000, Training Loss: 0.09013838320970535, Test Loss: 0.07294642925262451\n", "Epoch 1176/3000, Training Loss: 0.0900643989443779, Test Loss: 0.07290340960025787\n", "Epoch 1177/3000, Training Loss: 0.08999062329530716, Test Loss: 0.07286163419485092\n", "Epoch 1178/3000, Training Loss: 0.08991701900959015, Test Loss: 0.07282079011201859\n", "Epoch 1179/3000, Training Loss: 0.08984361588954926, Test Loss: 0.07277807593345642\n", "Epoch 1180/3000, Training Loss: 0.08977040648460388, Test Loss: 0.07273610681295395\n", "Epoch 1181/3000, Training Loss: 0.08969738334417343, Test Loss: 0.0726948156952858\n", "Epoch 1182/3000, Training Loss: 0.08962451666593552, Test Loss: 0.07265397161245346\n", "Epoch 1183/3000, Training Loss: 0.08955187350511551, Test Loss: 0.07261087745428085\n", "Epoch 1184/3000, Training Loss: 0.08947937935590744, Test Loss: 0.07256846129894257\n", "Epoch 1185/3000, Training Loss: 0.08940708637237549, Test Loss: 0.07252673804759979\n", "Epoch 1186/3000, Training Loss: 0.08933499455451965, Test Loss: 0.07248575985431671\n", "Epoch 1187/3000, Training Loss: 0.08926311135292053, Test Loss: 0.0724453330039978\n", "Epoch 1188/3000, Training Loss: 0.08919136971235275, Test Loss: 0.07240527123212814\n", "Epoch 1189/3000, Training Loss: 0.0891198143362999, Test Loss: 0.07236535102128983\n", "Epoch 1190/3000, Training Loss: 0.08904843777418137, Test Loss: 0.07232299447059631\n", "Epoch 1191/3000, Training Loss: 0.08897726237773895, Test Loss: 0.072278693318367\n", "Epoch 1192/3000, Training Loss: 0.08890623599290848, Test Loss: 0.07223569601774216\n", "Epoch 1193/3000, Training Loss: 0.08883543312549591, Test Loss: 0.07219434529542923\n", "Epoch 1194/3000, Training Loss: 0.08876477926969528, Test Loss: 0.07215471565723419\n", "Epoch 1195/3000, Training Loss: 0.08869431167840958, Test Loss: 0.07211639732122421\n", "Epoch 1196/3000, Training Loss: 0.0886240229010582, Test Loss: 0.07207910716533661\n", "Epoch 1197/3000, Training Loss: 0.08855392038822174, Test Loss: 0.07204214483499527\n", "Epoch 1198/3000, Training Loss: 0.08848398178815842, Test Loss: 0.07200504094362259\n", "Epoch 1199/3000, Training Loss: 0.08841419219970703, Test Loss: 0.07196731865406036\n", "Epoch 1200/3000, Training Loss: 0.08834463357925415, Test Loss: 0.07192878425121307\n", "Epoch 1201/3000, Training Loss: 0.0882752388715744, Test Loss: 0.07188927382230759\n", "Epoch 1202/3000, Training Loss: 0.088205985724926, Test Loss: 0.0718490332365036\n", "Epoch 1203/3000, Training Loss: 0.08813691139221191, Test Loss: 0.071805939078331\n", "Epoch 1204/3000, Training Loss: 
0.08806803077459335, Test Loss: 0.07176338881254196\n", "Epoch 1205/3000, Training Loss: 0.08799931406974792, Test Loss: 0.07172202318906784\n", "Epoch 1206/3000, Training Loss: 0.08793078362941742, Test Loss: 0.07168220728635788\n", "Epoch 1207/3000, Training Loss: 0.08786240220069885, Test Loss: 0.07164406776428223\n", "Epoch 1208/3000, Training Loss: 0.08779419213533401, Test Loss: 0.0716073140501976\n", "Epoch 1209/3000, Training Loss: 0.0877261608839035, Test Loss: 0.07156927138566971\n", "Epoch 1210/3000, Training Loss: 0.08765832334756851, Test Loss: 0.07153234630823135\n", "Epoch 1211/3000, Training Loss: 0.08759064227342606, Test Loss: 0.07149627804756165\n", "Epoch 1212/3000, Training Loss: 0.08752309530973434, Test Loss: 0.07146062701940536\n", "Epoch 1213/3000, Training Loss: 0.08745574951171875, Test Loss: 0.07142502814531326\n", "Epoch 1214/3000, Training Loss: 0.08738858252763748, Test Loss: 0.07138673961162567\n", "Epoch 1215/3000, Training Loss: 0.08732152730226517, Test Loss: 0.07134851068258286\n", "Epoch 1216/3000, Training Loss: 0.08725471794605255, Test Loss: 0.07131054252386093\n", "Epoch 1217/3000, Training Loss: 0.0871879905462265, Test Loss: 0.0712728500366211\n", "Epoch 1218/3000, Training Loss: 0.08712148666381836, Test Loss: 0.07123561203479767\n", "Epoch 1219/3000, Training Loss: 0.08705512434244156, Test Loss: 0.07119881361722946\n", "Epoch 1220/3000, Training Loss: 0.08698894828557968, Test Loss: 0.07116243988275528\n", "Epoch 1221/3000, Training Loss: 0.08692288398742676, Test Loss: 0.07112636417150497\n", "Epoch 1222/3000, Training Loss: 0.08685703575611115, Test Loss: 0.07109047472476959\n", "Epoch 1223/3000, Training Loss: 0.08679131418466568, Test Loss: 0.07105465978384018\n", "Epoch 1224/3000, Training Loss: 0.08672577142715454, Test Loss: 0.07101883739233017\n", "Epoch 1225/3000, Training Loss: 0.08666037768125534, Test Loss: 0.07098295539617538\n", "Epoch 1226/3000, Training Loss: 0.08659517019987106, Test Loss: 0.07094476372003555\n", "Epoch 1227/3000, Training Loss: 0.08653008937835693, Test Loss: 0.07090713083744049\n", "Epoch 1228/3000, Training Loss: 0.08646520972251892, Test Loss: 0.07087032496929169\n", "Epoch 1229/3000, Training Loss: 0.08640044182538986, Test Loss: 0.07083452492952347\n", "Epoch 1230/3000, Training Loss: 0.08633584529161453, Test Loss: 0.07079961150884628\n", "Epoch 1231/3000, Training Loss: 0.08627142757177353, Test Loss: 0.07076550275087357\n", "Epoch 1232/3000, Training Loss: 0.08620715886354446, Test Loss: 0.07073183357715607\n", "Epoch 1233/3000, Training Loss: 0.08614303916692734, Test Loss: 0.07069829851388931\n", "Epoch 1234/3000, Training Loss: 0.08607906103134155, Test Loss: 0.07066245377063751\n", "Epoch 1235/3000, Training Loss: 0.08601526916027069, Test Loss: 0.07062691450119019\n", "Epoch 1236/3000, Training Loss: 0.08595161885023117, Test Loss: 0.07059174031019211\n", "Epoch 1237/3000, Training Loss: 0.08588811755180359, Test Loss: 0.07055699080228806\n", "Epoch 1238/3000, Training Loss: 0.08582477271556854, Test Loss: 0.0705227255821228\n", "Epoch 1239/3000, Training Loss: 0.08576160669326782, Test Loss: 0.0704888254404068\n", "Epoch 1240/3000, Training Loss: 0.08569856733083725, Test Loss: 0.07045523822307587\n", "Epoch 1241/3000, Training Loss: 0.08563568443059921, Test Loss: 0.07042180746793747\n", "Epoch 1242/3000, Training Loss: 0.08557295799255371, Test Loss: 0.07038837671279907\n", "Epoch 1243/3000, Training Loss: 0.08551036566495895, Test Loss: 0.07035491615533829\n", "Epoch 1244/3000, Training Loss: 
0.08544793725013733, Test Loss: 0.07032133638858795\n", "Epoch 1245/3000, Training Loss: 0.08538568019866943, Test Loss: 0.07028767466545105\n", "Epoch 1246/3000, Training Loss: 0.0853235274553299, Test Loss: 0.0702538937330246\n", "Epoch 1247/3000, Training Loss: 0.08526155352592468, Test Loss: 0.07022014260292053\n", "Epoch 1248/3000, Training Loss: 0.08519971370697021, Test Loss: 0.07018646597862244\n", "Epoch 1249/3000, Training Loss: 0.08513806015253067, Test Loss: 0.07015295326709747\n", "Epoch 1250/3000, Training Loss: 0.0850764811038971, Test Loss: 0.07011748850345612\n", "Epoch 1251/3000, Training Loss: 0.08501512557268143, Test Loss: 0.0700829029083252\n", "Epoch 1252/3000, Training Loss: 0.08495386689901352, Test Loss: 0.07004944980144501\n", "Epoch 1253/3000, Training Loss: 0.08489279448986053, Test Loss: 0.07001712918281555\n", "Epoch 1254/3000, Training Loss: 0.08483181893825531, Test Loss: 0.06998573988676071\n", "Epoch 1255/3000, Training Loss: 0.08477100729942322, Test Loss: 0.06995506584644318\n", "Epoch 1256/3000, Training Loss: 0.08471036702394485, Test Loss: 0.06992461532354355\n", "Epoch 1257/3000, Training Loss: 0.08464984595775604, Test Loss: 0.06989407539367676\n", "Epoch 1258/3000, Training Loss: 0.08458949625492096, Test Loss: 0.06986317038536072\n", "Epoch 1259/3000, Training Loss: 0.08452925831079483, Test Loss: 0.0698317140340805\n", "Epoch 1260/3000, Training Loss: 0.08446918427944183, Test Loss: 0.06979961693286896\n", "Epoch 1261/3000, Training Loss: 0.08440923690795898, Test Loss: 0.06976709514856339\n", "Epoch 1262/3000, Training Loss: 0.08434943109750748, Test Loss: 0.06973425298929214\n", "Epoch 1263/3000, Training Loss: 0.08428977429866791, Test Loss: 0.06970132887363434\n", "Epoch 1264/3000, Training Loss: 0.08423024415969849, Test Loss: 0.06966879218816757\n", "Epoch 1265/3000, Training Loss: 0.08417090028524399, Test Loss: 0.0696367472410202\n", "Epoch 1266/3000, Training Loss: 0.08411166071891785, Test Loss: 0.06960539519786835\n", "Epoch 1267/3000, Training Loss: 0.08405258506536484, Test Loss: 0.06957471370697021\n", "Epoch 1268/3000, Training Loss: 0.08399362117052078, Test Loss: 0.06954468041658401\n", "Epoch 1269/3000, Training Loss: 0.08393481373786926, Test Loss: 0.06951501965522766\n", "Epoch 1270/3000, Training Loss: 0.08387615531682968, Test Loss: 0.06948565691709518\n", "Epoch 1271/3000, Training Loss: 0.08381762355566025, Test Loss: 0.06945633888244629\n", "Epoch 1272/3000, Training Loss: 0.08375923335552216, Test Loss: 0.06942691653966904\n", "Epoch 1273/3000, Training Loss: 0.08370094001293182, Test Loss: 0.06939727067947388\n", "Epoch 1274/3000, Training Loss: 0.08364284038543701, Test Loss: 0.06936737149953842\n", "Epoch 1275/3000, Training Loss: 0.08358482271432877, Test Loss: 0.0693371444940567\n", "Epoch 1276/3000, Training Loss: 0.08352696895599365, Test Loss: 0.06930675357580185\n", "Epoch 1277/3000, Training Loss: 0.08346925675868988, Test Loss: 0.0692763403058052\n", "Epoch 1278/3000, Training Loss: 0.08341170102357864, Test Loss: 0.06924605369567871\n", "Epoch 1279/3000, Training Loss: 0.08335424214601517, Test Loss: 0.06921592354774475\n", "Epoch 1280/3000, Training Loss: 0.08329691737890244, Test Loss: 0.0691860169172287\n", "Epoch 1281/3000, Training Loss: 0.08323971182107925, Test Loss: 0.06915764510631561\n", "Epoch 1282/3000, Training Loss: 0.08318270742893219, Test Loss: 0.06913035362958908\n", "Epoch 1283/3000, Training Loss: 0.08312580734491348, Test Loss: 0.06910370290279388\n", "Epoch 1284/3000, Training Loss: 
0.08306902647018433, Test Loss: 0.06907709687948227\n", "Epoch 1285/3000, Training Loss: 0.08301238715648651, Test Loss: 0.06904997676610947\n", "Epoch 1286/3000, Training Loss: 0.08295591175556183, Test Loss: 0.06902196258306503\n", "Epoch 1287/3000, Training Loss: 0.08289948850870132, Test Loss: 0.06899286806583405\n", "Epoch 1288/3000, Training Loss: 0.08284328877925873, Test Loss: 0.06896285712718964\n", "Epoch 1289/3000, Training Loss: 0.08278714865446091, Test Loss: 0.06893215328454971\n", "Epoch 1290/3000, Training Loss: 0.08273115009069443, Test Loss: 0.06890111416578293\n", "Epoch 1291/3000, Training Loss: 0.08267530798912048, Test Loss: 0.0688701942563057\n", "Epoch 1292/3000, Training Loss: 0.08261959999799728, Test Loss: 0.0688396766781807\n", "Epoch 1293/3000, Training Loss: 0.08256398141384125, Test Loss: 0.06880991905927658\n", "Epoch 1294/3000, Training Loss: 0.08250848948955536, Test Loss: 0.06878095865249634\n", "Epoch 1295/3000, Training Loss: 0.08245314657688141, Test Loss: 0.06875276565551758\n", "Epoch 1296/3000, Training Loss: 0.08239791542291641, Test Loss: 0.06872515380382538\n", "Epoch 1297/3000, Training Loss: 0.08234284073114395, Test Loss: 0.06869799643754959\n", "Epoch 1298/3000, Training Loss: 0.08228782564401627, Test Loss: 0.06867101043462753\n", "Epoch 1299/3000, Training Loss: 0.0822330191731453, Test Loss: 0.06864387542009354\n", "Epoch 1300/3000, Training Loss: 0.08217830955982208, Test Loss: 0.06861641258001328\n", "Epoch 1301/3000, Training Loss: 0.08212371170520782, Test Loss: 0.0685885101556778\n", "Epoch 1302/3000, Training Loss: 0.08206921815872192, Test Loss: 0.0685601532459259\n", "Epoch 1303/3000, Training Loss: 0.08201488107442856, Test Loss: 0.06853149086236954\n", "Epoch 1304/3000, Training Loss: 0.08196068555116653, Test Loss: 0.06850269436836243\n", "Epoch 1305/3000, Training Loss: 0.08190656453371048, Test Loss: 0.0684739202260971\n", "Epoch 1306/3000, Training Loss: 0.08185259997844696, Test Loss: 0.0684455931186676\n", "Epoch 1307/3000, Training Loss: 0.0817987397313118, Test Loss: 0.0684177428483963\n", "Epoch 1308/3000, Training Loss: 0.08174500614404678, Test Loss: 0.0683903694152832\n", "Epoch 1309/3000, Training Loss: 0.08169137686491013, Test Loss: 0.06836336106061935\n", "Epoch 1310/3000, Training Loss: 0.081637904047966, Test Loss: 0.06833676248788834\n", "Epoch 1311/3000, Training Loss: 0.08158453553915024, Test Loss: 0.06831035763025284\n", "Epoch 1312/3000, Training Loss: 0.08153127878904343, Test Loss: 0.06828412413597107\n", "Epoch 1313/3000, Training Loss: 0.08147814124822617, Test Loss: 0.0682578757405281\n", "Epoch 1314/3000, Training Loss: 0.08142513036727905, Test Loss: 0.06823147088289261\n", "Epoch 1315/3000, Training Loss: 0.08137226104736328, Test Loss: 0.06820487231016159\n", "Epoch 1316/3000, Training Loss: 0.08131948113441467, Test Loss: 0.06817807257175446\n", "Epoch 1317/3000, Training Loss: 0.08126682043075562, Test Loss: 0.06815122812986374\n", "Epoch 1318/3000, Training Loss: 0.08121427148580551, Test Loss: 0.06812448799610138\n", "Epoch 1319/3000, Training Loss: 0.08116186410188675, Test Loss: 0.06809782981872559\n", "Epoch 1320/3000, Training Loss: 0.08110956102609634, Test Loss: 0.06807135045528412\n", "Epoch 1321/3000, Training Loss: 0.08105736970901489, Test Loss: 0.06804515421390533\n", "Epoch 1322/3000, Training Loss: 0.08100531250238419, Test Loss: 0.06801912933588028\n", "Epoch 1323/3000, Training Loss: 0.08095333725214005, Test Loss: 0.06799337267875671\n", "Epoch 1324/3000, Training Loss: 
0.08090150356292725, Test Loss: 0.06796783953905106\n", "Epoch 1325/3000, Training Loss: 0.080849789083004, Test Loss: 0.06794249266386032\n", "Epoch 1326/3000, Training Loss: 0.08079815655946732, Test Loss: 0.06791730225086212\n", "Epoch 1327/3000, Training Loss: 0.08074668049812317, Test Loss: 0.06789218634366989\n", "Epoch 1328/3000, Training Loss: 0.08069529384374619, Test Loss: 0.06786704063415527\n", "Epoch 1329/3000, Training Loss: 0.08064402639865875, Test Loss: 0.06784185022115707\n", "Epoch 1330/3000, Training Loss: 0.08059288561344147, Test Loss: 0.06781662255525589\n", "Epoch 1331/3000, Training Loss: 0.08054182678461075, Test Loss: 0.06779135763645172\n", "Epoch 1332/3000, Training Loss: 0.08049090951681137, Test Loss: 0.06776607781648636\n", "Epoch 1333/3000, Training Loss: 0.08044009655714035, Test Loss: 0.06774084270000458\n", "Epoch 1334/3000, Training Loss: 0.0803893506526947, Test Loss: 0.06771575659513474\n", "Epoch 1335/3000, Training Loss: 0.08033879101276398, Test Loss: 0.06769086420536041\n", "Epoch 1336/3000, Training Loss: 0.08028829097747803, Test Loss: 0.0676662027835846\n", "Epoch 1337/3000, Training Loss: 0.08023791015148163, Test Loss: 0.0676417201757431\n", "Epoch 1338/3000, Training Loss: 0.08018767088651657, Test Loss: 0.0676175206899643\n", "Epoch 1339/3000, Training Loss: 0.08013749122619629, Test Loss: 0.06759338080883026\n", "Epoch 1340/3000, Training Loss: 0.08008748292922974, Test Loss: 0.06756927818059921\n", "Epoch 1341/3000, Training Loss: 0.08003754168748856, Test Loss: 0.06754514575004578\n", "Epoch 1342/3000, Training Loss: 0.07998768985271454, Test Loss: 0.06752101331949234\n", "Epoch 1343/3000, Training Loss: 0.07993798702955246, Test Loss: 0.06749685108661652\n", "Epoch 1344/3000, Training Loss: 0.07988837361335754, Test Loss: 0.06747282296419144\n", "Epoch 1345/3000, Training Loss: 0.07983889430761337, Test Loss: 0.06744883209466934\n", "Epoch 1346/3000, Training Loss: 0.07978948950767517, Test Loss: 0.06742497533559799\n", "Epoch 1347/3000, Training Loss: 0.07974021136760712, Test Loss: 0.06740114837884903\n", "Epoch 1348/3000, Training Loss: 0.07969101518392563, Test Loss: 0.06737737357616425\n", "Epoch 1349/3000, Training Loss: 0.07964194566011429, Test Loss: 0.06735377758741379\n", "Epoch 1350/3000, Training Loss: 0.07959297299385071, Test Loss: 0.0673302635550499\n", "Epoch 1351/3000, Training Loss: 0.07954413443803787, Test Loss: 0.06730694323778152\n", "Epoch 1352/3000, Training Loss: 0.07949536293745041, Test Loss: 0.06728367507457733\n", "Epoch 1353/3000, Training Loss: 0.07944672554731369, Test Loss: 0.0672605037689209\n", "Epoch 1354/3000, Training Loss: 0.07939817756414413, Test Loss: 0.06723734736442566\n", "Epoch 1355/3000, Training Loss: 0.07934974879026413, Test Loss: 0.06721428036689758\n", "Epoch 1356/3000, Training Loss: 0.0793013945221901, Test Loss: 0.06719131767749786\n", "Epoch 1357/3000, Training Loss: 0.0792531669139862, Test Loss: 0.0671684741973877\n", "Epoch 1358/3000, Training Loss: 0.07920504361391068, Test Loss: 0.06714561581611633\n", "Epoch 1359/3000, Training Loss: 0.0791570171713829, Test Loss: 0.06712279468774796\n", "Epoch 1360/3000, Training Loss: 0.0791090875864029, Test Loss: 0.06709997355937958\n", "Epoch 1361/3000, Training Loss: 0.07906126976013184, Test Loss: 0.06707723438739777\n", "Epoch 1362/3000, Training Loss: 0.07901354879140854, Test Loss: 0.06705456227064133\n", "Epoch 1363/3000, Training Loss: 0.0789659395813942, Test Loss: 0.06703206896781921\n", "Epoch 1364/3000, Training Loss: 
0.07891839742660522, Test Loss: 0.06700964272022247\n", "Epoch 1365/3000, Training Loss: 0.07887101173400879, Test Loss: 0.06698735803365707\n", "Epoch 1366/3000, Training Loss: 0.07882367819547653, Test Loss: 0.06696508824825287\n", "Epoch 1367/3000, Training Loss: 0.07877650856971741, Test Loss: 0.06694294512271881\n", "Epoch 1368/3000, Training Loss: 0.07872937619686127, Test Loss: 0.06692083925008774\n", "Epoch 1369/3000, Training Loss: 0.07868235558271408, Test Loss: 0.06689886748790741\n", "Epoch 1370/3000, Training Loss: 0.07863543182611465, Test Loss: 0.06687693297863007\n", "Epoch 1371/3000, Training Loss: 0.07858861982822418, Test Loss: 0.06685502082109451\n", "Epoch 1372/3000, Training Loss: 0.07854191213846207, Test Loss: 0.06683314591646194\n", "Epoch 1373/3000, Training Loss: 0.07849528640508652, Test Loss: 0.06681133806705475\n", "Epoch 1374/3000, Training Loss: 0.07844878733158112, Test Loss: 0.0667896568775177\n", "Epoch 1375/3000, Training Loss: 0.07840237021446228, Test Loss: 0.06676806509494781\n", "Epoch 1376/3000, Training Loss: 0.07835602760314941, Test Loss: 0.0667465552687645\n", "Epoch 1377/3000, Training Loss: 0.0783098042011261, Test Loss: 0.06672506034374237\n", "Epoch 1378/3000, Training Loss: 0.07826367765665054, Test Loss: 0.06670355796813965\n", "Epoch 1379/3000, Training Loss: 0.07821765542030334, Test Loss: 0.06668217480182648\n", "Epoch 1380/3000, Training Loss: 0.07817170768976212, Test Loss: 0.06666090339422226\n", "Epoch 1381/3000, Training Loss: 0.07812588661909103, Test Loss: 0.0666397288441658\n", "Epoch 1382/3000, Training Loss: 0.07808011770248413, Test Loss: 0.06661870330572128\n", "Epoch 1383/3000, Training Loss: 0.07803449034690857, Test Loss: 0.06659776717424393\n", "Epoch 1384/3000, Training Loss: 0.07798895239830017, Test Loss: 0.06657684594392776\n", "Epoch 1385/3000, Training Loss: 0.07794350385665894, Test Loss: 0.06655597686767578\n", "Epoch 1386/3000, Training Loss: 0.07789813727140427, Test Loss: 0.0665351003408432\n", "Epoch 1387/3000, Training Loss: 0.07785286009311676, Test Loss: 0.0665142834186554\n", "Epoch 1388/3000, Training Loss: 0.07780769467353821, Test Loss: 0.06649347394704819\n", "Epoch 1389/3000, Training Loss: 0.07776261121034622, Test Loss: 0.06647279113531113\n", "Epoch 1390/3000, Training Loss: 0.077717624604702, Test Loss: 0.06645220518112183\n", "Epoch 1391/3000, Training Loss: 0.07767273485660553, Test Loss: 0.06643173098564148\n", "Epoch 1392/3000, Training Loss: 0.07762792706489563, Test Loss: 0.06641135364770889\n", "Epoch 1393/3000, Training Loss: 0.07758323103189468, Test Loss: 0.06639103591442108\n", "Epoch 1394/3000, Training Loss: 0.0775386169552803, Test Loss: 0.06637074053287506\n", "Epoch 1395/3000, Training Loss: 0.07749409973621368, Test Loss: 0.06635048240423203\n", "Epoch 1396/3000, Training Loss: 0.07744965702295303, Test Loss: 0.06633032113313675\n", "Epoch 1397/3000, Training Loss: 0.07740533351898193, Test Loss: 0.0663103237748146\n", "Epoch 1398/3000, Training Loss: 0.0773610845208168, Test Loss: 0.06629027426242828\n", "Epoch 1399/3000, Training Loss: 0.07731691747903824, Test Loss: 0.06627030670642853\n", "Epoch 1400/3000, Training Loss: 0.07727285474538803, Test Loss: 0.06625036150217056\n", "Epoch 1401/3000, Training Loss: 0.07722886651754379, Test Loss: 0.06623054295778275\n", "Epoch 1402/3000, Training Loss: 0.0771850049495697, Test Loss: 0.0662107989192009\n", "Epoch 1403/3000, Training Loss: 0.0771411806344986, Test Loss: 0.06619109958410263\n", "Epoch 1404/3000, Training Loss: 
0.07709748297929764, Test Loss: 0.06617148220539093\n", "Epoch 1405/3000, Training Loss: 0.07705388963222504, Test Loss: 0.06615190207958221\n", "Epoch 1406/3000, Training Loss: 0.07701034843921661, Test Loss: 0.0661323219537735\n", "Epoch 1407/3000, Training Loss: 0.07696687430143356, Test Loss: 0.0661129429936409\n", "Epoch 1408/3000, Training Loss: 0.07692356407642365, Test Loss: 0.0660935714840889\n", "Epoch 1409/3000, Training Loss: 0.0768803060054779, Test Loss: 0.06607436388731003\n", "Epoch 1410/3000, Training Loss: 0.07683712989091873, Test Loss: 0.06605513393878937\n", "Epoch 1411/3000, Training Loss: 0.07679403573274612, Test Loss: 0.06603596359491348\n", "Epoch 1412/3000, Training Loss: 0.07675102353096008, Test Loss: 0.06601682305335999\n", "Epoch 1413/3000, Training Loss: 0.07670814543962479, Test Loss: 0.06599775701761246\n", "Epoch 1414/3000, Training Loss: 0.07666530460119247, Test Loss: 0.06597879528999329\n", "Epoch 1415/3000, Training Loss: 0.07662255316972733, Test Loss: 0.06595983356237411\n", "Epoch 1416/3000, Training Loss: 0.07657990604639053, Test Loss: 0.06594091653823853\n", "Epoch 1417/3000, Training Loss: 0.07653733342885971, Test Loss: 0.06592213362455368\n", "Epoch 1418/3000, Training Loss: 0.07649484276771545, Test Loss: 0.06590341031551361\n", "Epoch 1419/3000, Training Loss: 0.07645244151353836, Test Loss: 0.06588473170995712\n", "Epoch 1420/3000, Training Loss: 0.07641014456748962, Test Loss: 0.06586609780788422\n", "Epoch 1421/3000, Training Loss: 0.07636791467666626, Test Loss: 0.06584753841161728\n", "Epoch 1422/3000, Training Loss: 0.07632579654455185, Test Loss: 0.0658290907740593\n", "Epoch 1423/3000, Training Loss: 0.07628373056650162, Test Loss: 0.06581069529056549\n", "Epoch 1424/3000, Training Loss: 0.07624174654483795, Test Loss: 0.06579229235649109\n", "Epoch 1425/3000, Training Loss: 0.07619986683130264, Test Loss: 0.06577400118112564\n", "Epoch 1426/3000, Training Loss: 0.0761580690741539, Test Loss: 0.06575573980808258\n", "Epoch 1427/3000, Training Loss: 0.07611636072397232, Test Loss: 0.06573756039142609\n", "Epoch 1428/3000, Training Loss: 0.07607471197843552, Test Loss: 0.06571944057941437\n", "Epoch 1429/3000, Training Loss: 0.07603316754102707, Test Loss: 0.06570135802030563\n", "Epoch 1430/3000, Training Loss: 0.075991690158844, Test Loss: 0.06568333506584167\n", "Epoch 1431/3000, Training Loss: 0.07595028728246689, Test Loss: 0.06566530466079712\n", "Epoch 1432/3000, Training Loss: 0.07590900361537933, Test Loss: 0.0656474232673645\n", "Epoch 1433/3000, Training Loss: 0.07586775720119476, Test Loss: 0.06562959402799606\n", "Epoch 1434/3000, Training Loss: 0.07582662999629974, Test Loss: 0.06561186164617538\n", "Epoch 1435/3000, Training Loss: 0.0757855549454689, Test Loss: 0.06559418141841888\n", "Epoch 1436/3000, Training Loss: 0.07574456185102463, Test Loss: 0.06557649374008179\n", "Epoch 1437/3000, Training Loss: 0.0757036805152893, Test Loss: 0.06555886566638947\n", "Epoch 1438/3000, Training Loss: 0.07566285878419876, Test Loss: 0.0655413344502449\n", "Epoch 1439/3000, Training Loss: 0.07562210410833359, Test Loss: 0.06552384048700333\n", "Epoch 1440/3000, Training Loss: 0.07558143883943558, Test Loss: 0.0655064806342125\n", "Epoch 1441/3000, Training Loss: 0.07554088532924652, Test Loss: 0.06548909842967987\n", "Epoch 1442/3000, Training Loss: 0.07550038397312164, Test Loss: 0.06547179818153381\n", "Epoch 1443/3000, Training Loss: 0.07545994967222214, Test Loss: 0.06545449048280716\n", "Epoch 1444/3000, Training Loss: 
0.07541961967945099, Test Loss: 0.06543734669685364\n", "Epoch 1445/3000, Training Loss: 0.07537934929132462, Test Loss: 0.0654202252626419\n", "Epoch 1446/3000, Training Loss: 0.0753391906619072, Test Loss: 0.06540317833423615\n", "Epoch 1447/3000, Training Loss: 0.07529906928539276, Test Loss: 0.06538616120815277\n", "Epoch 1448/3000, Training Loss: 0.07525904476642609, Test Loss: 0.0653691217303276\n", "Epoch 1449/3000, Training Loss: 0.07521908730268478, Test Loss: 0.06535215675830841\n", "Epoch 1450/3000, Training Loss: 0.07517924904823303, Test Loss: 0.06533531099557877\n", "Epoch 1451/3000, Training Loss: 0.07513943314552307, Test Loss: 0.0653185248374939\n", "Epoch 1452/3000, Training Loss: 0.07509969919919968, Test Loss: 0.06530173122882843\n", "Epoch 1453/3000, Training Loss: 0.07506005465984344, Test Loss: 0.06528693437576294\n", "Epoch 1454/3000, Training Loss: 0.07502052932977676, Test Loss: 0.06527359038591385\n", "Epoch 1455/3000, Training Loss: 0.07498104870319366, Test Loss: 0.0652608647942543\n", "Epoch 1456/3000, Training Loss: 0.07494162768125534, Test Loss: 0.06524787843227386\n", "Epoch 1457/3000, Training Loss: 0.07490230351686478, Test Loss: 0.0652339905500412\n", "Epoch 1458/3000, Training Loss: 0.07486305385828018, Test Loss: 0.06521681696176529\n", "Epoch 1459/3000, Training Loss: 0.07482389360666275, Test Loss: 0.06519683450460434\n", "Epoch 1460/3000, Training Loss: 0.0747847706079483, Test Loss: 0.0651751160621643\n", "Epoch 1461/3000, Training Loss: 0.07474572211503983, Test Loss: 0.06515297293663025\n", "Epoch 1462/3000, Training Loss: 0.0747068002820015, Test Loss: 0.06513357907533646\n", "Epoch 1463/3000, Training Loss: 0.07466791570186615, Test Loss: 0.06511736661195755\n", "Epoch 1464/3000, Training Loss: 0.07462912797927856, Test Loss: 0.06510213762521744\n", "Epoch 1465/3000, Training Loss: 0.07459038496017456, Test Loss: 0.06508755683898926\n", "Epoch 1466/3000, Training Loss: 0.07455171644687653, Test Loss: 0.06507333368062973\n", "Epoch 1467/3000, Training Loss: 0.07451315224170685, Test Loss: 0.06505907326936722\n", "Epoch 1468/3000, Training Loss: 0.07447464019060135, Test Loss: 0.06504442542791367\n", "Epoch 1469/3000, Training Loss: 0.07443620264530182, Test Loss: 0.06502921134233475\n", "Epoch 1470/3000, Training Loss: 0.07439786195755005, Test Loss: 0.06501345336437225\n", "Epoch 1471/3000, Training Loss: 0.07435955107212067, Test Loss: 0.064997099339962\n", "Epoch 1472/3000, Training Loss: 0.07432132214307785, Test Loss: 0.06498221307992935\n", "Epoch 1473/3000, Training Loss: 0.07428320497274399, Test Loss: 0.06496843695640564\n", "Epoch 1474/3000, Training Loss: 0.07424511760473251, Test Loss: 0.06495356559753418\n", "Epoch 1475/3000, Training Loss: 0.0742071345448494, Test Loss: 0.06493747979402542\n", "Epoch 1476/3000, Training Loss: 0.07416921854019165, Test Loss: 0.06492234766483307\n", "Epoch 1477/3000, Training Loss: 0.07413136959075928, Test Loss: 0.06490612775087357\n", "Epoch 1478/3000, Training Loss: 0.07409355789422989, Test Loss: 0.06488906592130661\n", "Epoch 1479/3000, Training Loss: 0.07405588030815125, Test Loss: 0.06487345695495605\n", "Epoch 1480/3000, Training Loss: 0.07401823252439499, Test Loss: 0.06485741585493088\n", "Epoch 1481/3000, Training Loss: 0.0739806517958641, Test Loss: 0.06484115123748779\n", "Epoch 1482/3000, Training Loss: 0.07394316047430038, Test Loss: 0.06482677906751633\n", "Epoch 1483/3000, Training Loss: 0.07390573620796204, Test Loss: 0.0648120641708374\n", "Epoch 1484/3000, Training Loss: 
0.07386837154626846, Test Loss: 0.06479690223932266\n", "[... per-epoch output for epochs 1485-2318 condensed: training loss decreases steadily from about 0.0738 to 0.0557, and test loss from about 0.0648 to 0.0604 ...]\n", "Epoch 2319/3000, Training Loss: 0.05569227784872055, Test Loss: 
0.06044484302401543\n", "Epoch 2320/3000, Training Loss: 0.05567837879061699, Test Loss: 0.060444578528404236\n", "Epoch 2321/3000, Training Loss: 0.055664412677288055, Test Loss: 0.0604441836476326\n", "Epoch 2322/3000, Training Loss: 0.05565052852034569, Test Loss: 0.060443583875894547\n", "Epoch 2323/3000, Training Loss: 0.05563662201166153, Test Loss: 0.060442715883255005\n", "Epoch 2324/3000, Training Loss: 0.05562269687652588, Test Loss: 0.06044170260429382\n", "Epoch 2325/3000, Training Loss: 0.05560878664255142, Test Loss: 0.06044071912765503\n", "Epoch 2326/3000, Training Loss: 0.05559488385915756, Test Loss: 0.060440197587013245\n", "Epoch 2327/3000, Training Loss: 0.05558100342750549, Test Loss: 0.060440026223659515\n", "Epoch 2328/3000, Training Loss: 0.055567122995853424, Test Loss: 0.060439687222242355\n", "Epoch 2329/3000, Training Loss: 0.05555327609181404, Test Loss: 0.06043907627463341\n", "Epoch 2330/3000, Training Loss: 0.05553940311074257, Test Loss: 0.060438208281993866\n", "Epoch 2331/3000, Training Loss: 0.05552559718489647, Test Loss: 0.06043713167309761\n", "Epoch 2332/3000, Training Loss: 0.05551173537969589, Test Loss: 0.06043610721826553\n", "Epoch 2333/3000, Training Loss: 0.05549788102507591, Test Loss: 0.060435160994529724\n", "Epoch 2334/3000, Training Loss: 0.05548406019806862, Test Loss: 0.06043487414717674\n", "Epoch 2335/3000, Training Loss: 0.05547025054693222, Test Loss: 0.06043499708175659\n", "Epoch 2336/3000, Training Loss: 0.05545644089579582, Test Loss: 0.060434892773628235\n", "Epoch 2337/3000, Training Loss: 0.05544265732169151, Test Loss: 0.060434386134147644\n", "Epoch 2338/3000, Training Loss: 0.05542886629700661, Test Loss: 0.06043350696563721\n", "Epoch 2339/3000, Training Loss: 0.0554150715470314, Test Loss: 0.060432396829128265\n", "Epoch 2340/3000, Training Loss: 0.05540131404995918, Test Loss: 0.060431238263845444\n", "Epoch 2341/3000, Training Loss: 0.05538754165172577, Test Loss: 0.06043017655611038\n", "Epoch 2342/3000, Training Loss: 0.055373769253492355, Test Loss: 0.060429465025663376\n", "Epoch 2343/3000, Training Loss: 0.055360037833452225, Test Loss: 0.06042904034256935\n", "Epoch 2344/3000, Training Loss: 0.05534631013870239, Test Loss: 0.060429222881793976\n", "Epoch 2345/3000, Training Loss: 0.055332571268081665, Test Loss: 0.060429323464632034\n", "Epoch 2346/3000, Training Loss: 0.05531885102391243, Test Loss: 0.06042908504605293\n", "Epoch 2347/3000, Training Loss: 0.055305130779743195, Test Loss: 0.060428522527217865\n", "Epoch 2348/3000, Training Loss: 0.05529142916202545, Test Loss: 0.06042763218283653\n", "Epoch 2349/3000, Training Loss: 0.055277734994888306, Test Loss: 0.060426972806453705\n", "Epoch 2350/3000, Training Loss: 0.05526401475071907, Test Loss: 0.0604264959692955\n", "Epoch 2351/3000, Training Loss: 0.05525031313300133, Test Loss: 0.060425810515880585\n", "Epoch 2352/3000, Training Loss: 0.05523662641644478, Test Loss: 0.060425009578466415\n", "Epoch 2353/3000, Training Loss: 0.05522296950221062, Test Loss: 0.060424186289310455\n", "Epoch 2354/3000, Training Loss: 0.055209286510944366, Test Loss: 0.06042345613241196\n", "Epoch 2355/3000, Training Loss: 0.055195655673742294, Test Loss: 0.060422852635383606\n", "Epoch 2356/3000, Training Loss: 0.05518200621008873, Test Loss: 0.06042235717177391\n", "Epoch 2357/3000, Training Loss: 0.05516839399933815, Test Loss: 0.06042208522558212\n", "Epoch 2358/3000, Training Loss: 0.05515472590923309, Test Loss: 0.06042184680700302\n", "Epoch 2359/3000, Training Loss: 
0.0551411509513855, Test Loss: 0.06042154133319855\n", "Epoch 2360/3000, Training Loss: 0.05512750521302223, Test Loss: 0.06042119860649109\n", "Epoch 2361/3000, Training Loss: 0.05511391907930374, Test Loss: 0.06042061001062393\n", "Epoch 2362/3000, Training Loss: 0.05510034039616585, Test Loss: 0.06041998416185379\n", "Epoch 2363/3000, Training Loss: 0.055086731910705566, Test Loss: 0.060419317334890366\n", "Epoch 2364/3000, Training Loss: 0.055073145776987076, Test Loss: 0.060418713837862015\n", "Epoch 2365/3000, Training Loss: 0.055059533566236496, Test Loss: 0.060418568551540375\n", "Epoch 2366/3000, Training Loss: 0.05504598096013069, Test Loss: 0.06041829288005829\n", "Epoch 2367/3000, Training Loss: 0.05503242462873459, Test Loss: 0.06041790172457695\n", "Epoch 2368/3000, Training Loss: 0.05501887574791908, Test Loss: 0.06041740998625755\n", "Epoch 2369/3000, Training Loss: 0.05500528961420059, Test Loss: 0.0604168176651001\n", "Epoch 2370/3000, Training Loss: 0.05499180406332016, Test Loss: 0.060416121035814285\n", "Epoch 2371/3000, Training Loss: 0.05497826263308525, Test Loss: 0.06041588634252548\n", "Epoch 2372/3000, Training Loss: 0.054964739829301834, Test Loss: 0.060415592044591904\n", "Epoch 2373/3000, Training Loss: 0.05495123565196991, Test Loss: 0.06041516736149788\n", "Epoch 2374/3000, Training Loss: 0.05493772029876709, Test Loss: 0.06041474640369415\n", "Epoch 2375/3000, Training Loss: 0.054924238473176956, Test Loss: 0.06041417270898819\n", "Epoch 2376/3000, Training Loss: 0.05491074174642563, Test Loss: 0.060413673520088196\n", "Epoch 2377/3000, Training Loss: 0.0548972524702549, Test Loss: 0.06041312590241432\n", "Epoch 2378/3000, Training Loss: 0.05488375201821327, Test Loss: 0.060412678867578506\n", "Epoch 2379/3000, Training Loss: 0.05487029254436493, Test Loss: 0.06041229888796806\n", "Epoch 2380/3000, Training Loss: 0.05485684052109718, Test Loss: 0.06041199341416359\n", "Epoch 2381/3000, Training Loss: 0.05484338849782944, Test Loss: 0.060411758720874786\n", "Epoch 2382/3000, Training Loss: 0.05482995882630348, Test Loss: 0.06041121855378151\n", "Epoch 2383/3000, Training Loss: 0.054816488176584244, Test Loss: 0.06041057035326958\n", "Epoch 2384/3000, Training Loss: 0.05480307713150978, Test Loss: 0.0604097954928875\n", "Epoch 2385/3000, Training Loss: 0.05478965863585472, Test Loss: 0.06040915101766586\n", "Epoch 2386/3000, Training Loss: 0.05477624386548996, Test Loss: 0.06040872633457184\n", "Epoch 2387/3000, Training Loss: 0.054762814193964005, Test Loss: 0.06040842831134796\n", "Epoch 2388/3000, Training Loss: 0.05474945530295372, Test Loss: 0.060408297926187515\n", "Epoch 2389/3000, Training Loss: 0.05473604053258896, Test Loss: 0.060408223420381546\n", "Epoch 2390/3000, Training Loss: 0.05472265183925629, Test Loss: 0.06040812283754349\n", "Epoch 2391/3000, Training Loss: 0.05470927059650421, Test Loss: 0.0604078471660614\n", "Epoch 2392/3000, Training Loss: 0.05469591170549393, Test Loss: 0.060407474637031555\n", "Epoch 2393/3000, Training Loss: 0.05468255281448364, Test Loss: 0.06040700152516365\n", "Epoch 2394/3000, Training Loss: 0.05466921627521515, Test Loss: 0.06040648743510246\n", "Epoch 2395/3000, Training Loss: 0.054655831307172775, Test Loss: 0.06040595844388008\n", "Epoch 2396/3000, Training Loss: 0.05464249849319458, Test Loss: 0.06040549650788307\n", "Epoch 2397/3000, Training Loss: 0.05462917312979698, Test Loss: 0.06040523573756218\n", "Epoch 2398/3000, Training Loss: 0.05461584031581879, Test Loss: 0.06040509045124054\n", "Epoch 
2399/3000, Training Loss: 0.054602526128292084, Test Loss: 0.06040498614311218\n", "Epoch 2400/3000, Training Loss: 0.05458919703960419, Test Loss: 0.06040484085679054\n", "Epoch 2401/3000, Training Loss: 0.05457592010498047, Test Loss: 0.060404613614082336\n", "Epoch 2402/3000, Training Loss: 0.05456260219216347, Test Loss: 0.06040430814027786\n", "Epoch 2403/3000, Training Loss: 0.054549314081668854, Test Loss: 0.0604039691388607\n", "Epoch 2404/3000, Training Loss: 0.05453602597117424, Test Loss: 0.06040350720286369\n", "Epoch 2405/3000, Training Loss: 0.054522786289453506, Test Loss: 0.06040303409099579\n", "Epoch 2406/3000, Training Loss: 0.05450949817895889, Test Loss: 0.06040270999073982\n", "Epoch 2407/3000, Training Loss: 0.05449622496962547, Test Loss: 0.06040245667099953\n", "Epoch 2408/3000, Training Loss: 0.054482992738485336, Test Loss: 0.06040225923061371\n", "Epoch 2409/3000, Training Loss: 0.05446973815560341, Test Loss: 0.060402143746614456\n", "Epoch 2410/3000, Training Loss: 0.05445651337504387, Test Loss: 0.060402095317840576\n", "Epoch 2411/3000, Training Loss: 0.05444326251745224, Test Loss: 0.060401905328035355\n", "Epoch 2412/3000, Training Loss: 0.05443001538515091, Test Loss: 0.06040161848068237\n", "Epoch 2413/3000, Training Loss: 0.05441680923104286, Test Loss: 0.06040121614933014\n", "Epoch 2414/3000, Training Loss: 0.05440361052751541, Test Loss: 0.060400888323783875\n", "Epoch 2415/3000, Training Loss: 0.05439039692282677, Test Loss: 0.06040056794881821\n", "Epoch 2416/3000, Training Loss: 0.054377198219299316, Test Loss: 0.06040027737617493\n", "Epoch 2417/3000, Training Loss: 0.05436402186751366, Test Loss: 0.06040005385875702\n", "Epoch 2418/3000, Training Loss: 0.0543508417904377, Test Loss: 0.06039991229772568\n", "Epoch 2419/3000, Training Loss: 0.054337628185749054, Test Loss: 0.06039976701140404\n", "Epoch 2420/3000, Training Loss: 0.05432448163628578, Test Loss: 0.06039958447217941\n", "Epoch 2421/3000, Training Loss: 0.054311320185661316, Test Loss: 0.06039942428469658\n", "Epoch 2422/3000, Training Loss: 0.05429817736148834, Test Loss: 0.06039922684431076\n", "Epoch 2423/3000, Training Loss: 0.05428497865796089, Test Loss: 0.060398999601602554\n", "Epoch 2424/3000, Training Loss: 0.054271869361400604, Test Loss: 0.060398805886507034\n", "Epoch 2425/3000, Training Loss: 0.05425872653722763, Test Loss: 0.06039852648973465\n", "Epoch 2426/3000, Training Loss: 0.05424561724066734, Test Loss: 0.060398273169994354\n", "Epoch 2427/3000, Training Loss: 0.054232463240623474, Test Loss: 0.06039804965257645\n", "Epoch 2428/3000, Training Loss: 0.054219361394643784, Test Loss: 0.06039787828922272\n", "Epoch 2429/3000, Training Loss: 0.05420626327395439, Test Loss: 0.06039777025580406\n", "Epoch 2430/3000, Training Loss: 0.054193127900362015, Test Loss: 0.060397643595933914\n", "Epoch 2431/3000, Training Loss: 0.054180048406124115, Test Loss: 0.06039752811193466\n", "Epoch 2432/3000, Training Loss: 0.05416696518659592, Test Loss: 0.060397371649742126\n", "Epoch 2433/3000, Training Loss: 0.054153867065906525, Test Loss: 0.06039724871516228\n", "Epoch 2434/3000, Training Loss: 0.05414077639579773, Test Loss: 0.060397014021873474\n", "Epoch 2435/3000, Training Loss: 0.05412772670388222, Test Loss: 0.060396816581487656\n", "Epoch 2436/3000, Training Loss: 0.0541146844625473, Test Loss: 0.06039661541581154\n", "Epoch 2437/3000, Training Loss: 0.05410163104534149, Test Loss: 0.0603964738547802\n", "Epoch 2438/3000, Training Loss: 0.05408855900168419, Test Loss: 
0.06039630249142647\n", "Epoch 2439/3000, Training Loss: 0.05407552048563957, Test Loss: 0.06039626523852348\n", "Epoch 2440/3000, Training Loss: 0.05406250059604645, Test Loss: 0.06039618328213692\n", "Epoch 2441/3000, Training Loss: 0.05404946580529213, Test Loss: 0.06039610505104065\n", "Epoch 2442/3000, Training Loss: 0.054036419838666916, Test Loss: 0.06039606034755707\n", "Epoch 2443/3000, Training Loss: 0.05402342975139618, Test Loss: 0.06039588898420334\n", "Epoch 2444/3000, Training Loss: 0.05401042476296425, Test Loss: 0.06039576977491379\n", "Epoch 2445/3000, Training Loss: 0.053997382521629333, Test Loss: 0.06039557233452797\n", "Epoch 2446/3000, Training Loss: 0.053984399884939194, Test Loss: 0.060395464301109314\n", "Epoch 2447/3000, Training Loss: 0.053971417248249054, Test Loss: 0.06039535999298096\n", "Epoch 2448/3000, Training Loss: 0.05395842716097832, Test Loss: 0.060395266860723495\n", "Epoch 2449/3000, Training Loss: 0.053945451974868774, Test Loss: 0.06039520725607872\n", "Epoch 2450/3000, Training Loss: 0.05393248423933983, Test Loss: 0.060395143926143646\n", "Epoch 2451/3000, Training Loss: 0.05391950532793999, Test Loss: 0.060395125299692154\n", "Epoch 2452/3000, Training Loss: 0.05390656739473343, Test Loss: 0.060395076870918274\n", "Epoch 2453/3000, Training Loss: 0.05389359965920448, Test Loss: 0.0603950060904026\n", "Epoch 2454/3000, Training Loss: 0.05388067290186882, Test Loss: 0.06039494276046753\n", "Epoch 2455/3000, Training Loss: 0.05386771634221077, Test Loss: 0.060394808650016785\n", "Epoch 2456/3000, Training Loss: 0.05385478958487511, Test Loss: 0.060394737869501114\n", "Epoch 2457/3000, Training Loss: 0.053841859102249146, Test Loss: 0.06039466708898544\n", "Epoch 2458/3000, Training Loss: 0.053828950971364975, Test Loss: 0.060394637286663055\n", "Epoch 2459/3000, Training Loss: 0.05381603166460991, Test Loss: 0.06039460375905037\n", "Epoch 2460/3000, Training Loss: 0.053803108632564545, Test Loss: 0.06039460375905037\n", "Epoch 2461/3000, Training Loss: 0.05379025265574455, Test Loss: 0.06039464846253395\n", "Epoch 2462/3000, Training Loss: 0.0537773072719574, Test Loss: 0.060394637286663055\n", "Epoch 2463/3000, Training Loss: 0.05376444384455681, Test Loss: 0.06039464846253395\n", "Epoch 2464/3000, Training Loss: 0.053751539438962936, Test Loss: 0.060394592583179474\n", "Epoch 2465/3000, Training Loss: 0.05373867601156235, Test Loss: 0.06039462238550186\n", "Epoch 2466/3000, Training Loss: 0.053725820034742355, Test Loss: 0.06039460003376007\n", "Epoch 2467/3000, Training Loss: 0.053712934255599976, Test Loss: 0.06039454787969589\n", "Epoch 2468/3000, Training Loss: 0.05370010435581207, Test Loss: 0.06039450317621231\n", "Epoch 2469/3000, Training Loss: 0.05368722975254059, Test Loss: 0.0603945292532444\n", "Epoch 2470/3000, Training Loss: 0.05367438495159149, Test Loss: 0.06039455533027649\n", "Epoch 2471/3000, Training Loss: 0.05366156995296478, Test Loss: 0.06039462238550186\n", "Epoch 2472/3000, Training Loss: 0.0536486990749836, Test Loss: 0.06039465591311455\n", "Epoch 2473/3000, Training Loss: 0.053635891526937485, Test Loss: 0.060394637286663055\n", "Epoch 2474/3000, Training Loss: 0.053623080253601074, Test Loss: 0.060394663363695145\n", "Epoch 2475/3000, Training Loss: 0.05361025407910347, Test Loss: 0.060394663363695145\n", "Epoch 2476/3000, Training Loss: 0.053597450256347656, Test Loss: 0.060394708067178726\n", "Epoch 2477/3000, Training Loss: 0.05358463153243065, Test Loss: 0.06039484590291977\n", "Epoch 2478/3000, Training Loss: 
0.05357186123728752, Test Loss: 0.06039490923285484\n", "Epoch 2479/3000, Training Loss: 0.05355904996395111, Test Loss: 0.06039492413401604\n", "Epoch 2480/3000, Training Loss: 0.05354629456996918, Test Loss: 0.060394980013370514\n", "Epoch 2481/3000, Training Loss: 0.05353351682424545, Test Loss: 0.060394980013370514\n", "Epoch 2482/3000, Training Loss: 0.053520724177360535, Test Loss: 0.06039503589272499\n", "Epoch 2483/3000, Training Loss: 0.05350795015692711, Test Loss: 0.060395143926143646\n", "Epoch 2484/3000, Training Loss: 0.05349518358707428, Test Loss: 0.060395222157239914\n", "Epoch 2485/3000, Training Loss: 0.053482431918382645, Test Loss: 0.06039531156420708\n", "Epoch 2486/3000, Training Loss: 0.053469691425561905, Test Loss: 0.06039540469646454\n", "Epoch 2487/3000, Training Loss: 0.05345696210861206, Test Loss: 0.060395512729883194\n", "Epoch 2488/3000, Training Loss: 0.053444214165210724, Test Loss: 0.06039560213685036\n", "Epoch 2489/3000, Training Loss: 0.053431492298841476, Test Loss: 0.06039571762084961\n", "Epoch 2490/3000, Training Loss: 0.053418777883052826, Test Loss: 0.06039583683013916\n", "Epoch 2491/3000, Training Loss: 0.05340607464313507, Test Loss: 0.060395900160074234\n", "Epoch 2492/3000, Training Loss: 0.053393326699733734, Test Loss: 0.060395970940589905\n", "Epoch 2493/3000, Training Loss: 0.053380630910396576, Test Loss: 0.06039608642458916\n", "Epoch 2494/3000, Training Loss: 0.053367938846349716, Test Loss: 0.06039620190858841\n", "Epoch 2495/3000, Training Loss: 0.053355228155851364, Test Loss: 0.06039632856845856\n", "Epoch 2496/3000, Training Loss: 0.0533425509929657, Test Loss: 0.060396481305360794\n", "Epoch 2497/3000, Training Loss: 0.05332987383008003, Test Loss: 0.06039665266871452\n", "Epoch 2498/3000, Training Loss: 0.05331717059016228, Test Loss: 0.06039681285619736\n", "Epoch 2499/3000, Training Loss: 0.053304530680179596, Test Loss: 0.06039690598845482\n", "Epoch 2500/3000, Training Loss: 0.05329183116555214, Test Loss: 0.06039686128497124\n", "Epoch 2501/3000, Training Loss: 0.05327921733260155, Test Loss: 0.060396816581487656\n", "Epoch 2502/3000, Training Loss: 0.05326653644442558, Test Loss: 0.06039689853787422\n", "Epoch 2503/3000, Training Loss: 0.05325392261147499, Test Loss: 0.06039702892303467\n", "Epoch 2504/3000, Training Loss: 0.05324126407504082, Test Loss: 0.06039724498987198\n", "Epoch 2505/3000, Training Loss: 0.05322863534092903, Test Loss: 0.06039757281541824\n", "Epoch 2506/3000, Training Loss: 0.05321601405739784, Test Loss: 0.06039787456393242\n", "Epoch 2507/3000, Training Loss: 0.05320340767502785, Test Loss: 0.060398221015930176\n", "Epoch 2508/3000, Training Loss: 0.05319078266620636, Test Loss: 0.060398370027542114\n", "Epoch 2509/3000, Training Loss: 0.053178176283836365, Test Loss: 0.06039850786328316\n", "Epoch 2510/3000, Training Loss: 0.05316557362675667, Test Loss: 0.06039860472083092\n", "Epoch 2511/3000, Training Loss: 0.05315297469496727, Test Loss: 0.060398705303668976\n", "Epoch 2512/3000, Training Loss: 0.053140368312597275, Test Loss: 0.0603988915681839\n", "Epoch 2513/3000, Training Loss: 0.05312778428196907, Test Loss: 0.06039903685450554\n", "Epoch 2514/3000, Training Loss: 0.053115226328372955, Test Loss: 0.060399290174245834\n", "Epoch 2515/3000, Training Loss: 0.05310261622071266, Test Loss: 0.06039939820766449\n", "Epoch 2516/3000, Training Loss: 0.05309008061885834, Test Loss: 0.060399457812309265\n", "Epoch 2517/3000, Training Loss: 0.053077515214681625, Test Loss: 0.060399506241083145\n", 
"Epoch 2518/3000, Training Loss: 0.05306493863463402, Test Loss: 0.06039956584572792\n", "Epoch 2519/3000, Training Loss: 0.0530523918569088, Test Loss: 0.06039980426430702\n", "Epoch 2520/3000, Training Loss: 0.053039856255054474, Test Loss: 0.06040016934275627\n", "Epoch 2521/3000, Training Loss: 0.05302732065320015, Test Loss: 0.06040049344301224\n", "Epoch 2522/3000, Training Loss: 0.05301478132605553, Test Loss: 0.06040087342262268\n", "Epoch 2523/3000, Training Loss: 0.053002238273620605, Test Loss: 0.06040124222636223\n", "Epoch 2524/3000, Training Loss: 0.05298972129821777, Test Loss: 0.06040148437023163\n", "Epoch 2525/3000, Training Loss: 0.052977193146944046, Test Loss: 0.06040172651410103\n", "Epoch 2526/3000, Training Loss: 0.052964720875024796, Test Loss: 0.060401879251003265\n", "Epoch 2527/3000, Training Loss: 0.05295216292142868, Test Loss: 0.0604020319879055\n", "Epoch 2528/3000, Training Loss: 0.05293968692421913, Test Loss: 0.06040211394429207\n", "Epoch 2529/3000, Training Loss: 0.052927181124687195, Test Loss: 0.06040229648351669\n", "Epoch 2530/3000, Training Loss: 0.05291469395160675, Test Loss: 0.06040266156196594\n", "Epoch 2531/3000, Training Loss: 0.052902232855558395, Test Loss: 0.06040304899215698\n", "Epoch 2532/3000, Training Loss: 0.05288971588015556, Test Loss: 0.06040344387292862\n", "Epoch 2533/3000, Training Loss: 0.052877262234687805, Test Loss: 0.060403816401958466\n", "Epoch 2534/3000, Training Loss: 0.05286481976509094, Test Loss: 0.06040414795279503\n", "Epoch 2535/3000, Training Loss: 0.05285234376788139, Test Loss: 0.060404419898986816\n", "Epoch 2536/3000, Training Loss: 0.052839912474155426, Test Loss: 0.06040465086698532\n", "Epoch 2537/3000, Training Loss: 0.05282740667462349, Test Loss: 0.06040474399924278\n", "Epoch 2538/3000, Training Loss: 0.05281498283147812, Test Loss: 0.06040499359369278\n", "Epoch 2539/3000, Training Loss: 0.052802540361881256, Test Loss: 0.0604049414396286\n", "Epoch 2540/3000, Training Loss: 0.05279012396931648, Test Loss: 0.06040491908788681\n", "Epoch 2541/3000, Training Loss: 0.05277765169739723, Test Loss: 0.06040484085679054\n", "Epoch 2542/3000, Training Loss: 0.052765268832445145, Test Loss: 0.06040506437420845\n", "Epoch 2543/3000, Training Loss: 0.05275283008813858, Test Loss: 0.06040548160672188\n", "Epoch 2544/3000, Training Loss: 0.052740417420864105, Test Loss: 0.06040599197149277\n", "Epoch 2545/3000, Training Loss: 0.05272801220417023, Test Loss: 0.060406602919101715\n", "Epoch 2546/3000, Training Loss: 0.05271559953689575, Test Loss: 0.06040715053677559\n", "Epoch 2547/3000, Training Loss: 0.05270320922136307, Test Loss: 0.060407619923353195\n", "Epoch 2548/3000, Training Loss: 0.05269083008170128, Test Loss: 0.06040782108902931\n", "Epoch 2549/3000, Training Loss: 0.05267846956849098, Test Loss: 0.060407910495996475\n", "Epoch 2550/3000, Training Loss: 0.0526660792529583, Test Loss: 0.060407936573028564\n", "Epoch 2551/3000, Training Loss: 0.05265370011329651, Test Loss: 0.060408107936382294\n", "Epoch 2552/3000, Training Loss: 0.05264131724834442, Test Loss: 0.06040835753083229\n", "Epoch 2553/3000, Training Loss: 0.05262894555926323, Test Loss: 0.06040876358747482\n", "Epoch 2554/3000, Training Loss: 0.05261657014489174, Test Loss: 0.06040928512811661\n", "Epoch 2555/3000, Training Loss: 0.05260423570871353, Test Loss: 0.06040985882282257\n", "Epoch 2556/3000, Training Loss: 0.05259189382195473, Test Loss: 0.060410380363464355\n", "Epoch 2557/3000, Training Loss: 0.052579548209905624, Test Loss: 
0.0604107491672039\n", "Epoch 2558/3000, Training Loss: 0.05256720632314682, Test Loss: 0.060411058366298676\n", "Epoch 2559/3000, Training Loss: 0.05255487933754921, Test Loss: 0.0604112446308136\n", "Epoch 2560/3000, Training Loss: 0.052542544901371, Test Loss: 0.060411401093006134\n", "Epoch 2561/3000, Training Loss: 0.052530206739902496, Test Loss: 0.06041159853339195\n", "Epoch 2562/3000, Training Loss: 0.05251789838075638, Test Loss: 0.06041193753480911\n", "Epoch 2563/3000, Training Loss: 0.05250559374690056, Test Loss: 0.06041235476732254\n", "Epoch 2564/3000, Training Loss: 0.05249328538775444, Test Loss: 0.06041295826435089\n", "Epoch 2565/3000, Training Loss: 0.05248098820447922, Test Loss: 0.06041378155350685\n", "Epoch 2566/3000, Training Loss: 0.052468717098236084, Test Loss: 0.0604143887758255\n", "Epoch 2567/3000, Training Loss: 0.05245643109083176, Test Loss: 0.06041499599814415\n", "Epoch 2568/3000, Training Loss: 0.05244408920407295, Test Loss: 0.06041530519723892\n", "Epoch 2569/3000, Training Loss: 0.05243181064724922, Test Loss: 0.06041562929749489\n", "Epoch 2570/3000, Training Loss: 0.05241952836513519, Test Loss: 0.06041599437594414\n", "Epoch 2571/3000, Training Loss: 0.05240728706121445, Test Loss: 0.060416482388973236\n", "Epoch 2572/3000, Training Loss: 0.052395015954971313, Test Loss: 0.060416992753744125\n", "Epoch 2573/3000, Training Loss: 0.05238275229930878, Test Loss: 0.06041744723916054\n", "Epoch 2574/3000, Training Loss: 0.05237049236893654, Test Loss: 0.06041782349348068\n", "Epoch 2575/3000, Training Loss: 0.0523582361638546, Test Loss: 0.06041812151670456\n", "Epoch 2576/3000, Training Loss: 0.05234600231051445, Test Loss: 0.06041842699050903\n", "Epoch 2577/3000, Training Loss: 0.05233374983072281, Test Loss: 0.060418713837862015\n", "Epoch 2578/3000, Training Loss: 0.05232150852680206, Test Loss: 0.06041901931166649\n", "Epoch 2579/3000, Training Loss: 0.052309293299913406, Test Loss: 0.06041938066482544\n", "Epoch 2580/3000, Training Loss: 0.05229705944657326, Test Loss: 0.060419853776693344\n", "Epoch 2581/3000, Training Loss: 0.05228481814265251, Test Loss: 0.06042030453681946\n", "Epoch 2582/3000, Training Loss: 0.052272599190473557, Test Loss: 0.06042075529694557\n", "Epoch 2583/3000, Training Loss: 0.05226041376590729, Test Loss: 0.06042126193642616\n", "Epoch 2584/3000, Training Loss: 0.052248191088438034, Test Loss: 0.0604216642677784\n", "Epoch 2585/3000, Training Loss: 0.052235979586839676, Test Loss: 0.060421936213970184\n", "Epoch 2586/3000, Training Loss: 0.05222379043698311, Test Loss: 0.06042228639125824\n", "Epoch 2587/3000, Training Loss: 0.052211612462997437, Test Loss: 0.060422707349061966\n", "Epoch 2588/3000, Training Loss: 0.05219939723610878, Test Loss: 0.060423146933317184\n", "Epoch 2589/3000, Training Loss: 0.05218721181154251, Test Loss: 0.06042364239692688\n", "Epoch 2590/3000, Training Loss: 0.052175071090459824, Test Loss: 0.060424111783504486\n", "Epoch 2591/3000, Training Loss: 0.052162885665893555, Test Loss: 0.06042455509305\n", "Epoch 2592/3000, Training Loss: 0.05215071886777878, Test Loss: 0.06042495742440224\n", "Epoch 2593/3000, Training Loss: 0.05213857814669609, Test Loss: 0.060425400733947754\n", "Epoch 2594/3000, Training Loss: 0.052126381546258926, Test Loss: 0.060425810515880585\n", "Epoch 2595/3000, Training Loss: 0.05211425572633743, Test Loss: 0.06042629852890968\n", "Epoch 2596/3000, Training Loss: 0.052102118730545044, Test Loss: 0.06042678654193878\n", "Epoch 2597/3000, Training Loss: 
0.05208994075655937, Test Loss: 0.06042727082967758\n", "Epoch 2598/3000, Training Loss: 0.052077800035476685, Test Loss: 0.06042778491973877\n", "Epoch 2599/3000, Training Loss: 0.05206567049026489, Test Loss: 0.06042824313044548\n", "Epoch 2600/3000, Training Loss: 0.05205357074737549, Test Loss: 0.060428690165281296\n", "Epoch 2601/3000, Training Loss: 0.052041418850421906, Test Loss: 0.0604291707277298\n", "Epoch 2602/3000, Training Loss: 0.05202929675579071, Test Loss: 0.06042968109250069\n", "Epoch 2603/3000, Training Loss: 0.05201718211174011, Test Loss: 0.06043021380901337\n", "Epoch 2604/3000, Training Loss: 0.05200508236885071, Test Loss: 0.06043071672320366\n", "Epoch 2605/3000, Training Loss: 0.051992978900671005, Test Loss: 0.06043129786849022\n", "Epoch 2606/3000, Training Loss: 0.05198086053133011, Test Loss: 0.06043181195855141\n", "Epoch 2607/3000, Training Loss: 0.0519687794148922, Test Loss: 0.06043233349919319\n", "Epoch 2608/3000, Training Loss: 0.051956698298454285, Test Loss: 0.0604327954351902\n", "Epoch 2609/3000, Training Loss: 0.051944587379693985, Test Loss: 0.060433272272348404\n", "Epoch 2610/3000, Training Loss: 0.051932502537965775, Test Loss: 0.06043381243944168\n", "Epoch 2611/3000, Training Loss: 0.05192043259739876, Test Loss: 0.06043442338705063\n", "Epoch 2612/3000, Training Loss: 0.051908351480960846, Test Loss: 0.0604349710047245\n", "Epoch 2613/3000, Training Loss: 0.05189628526568413, Test Loss: 0.060435544699430466\n", "Epoch 2614/3000, Training Loss: 0.05188421532511711, Test Loss: 0.06043610721826553\n", "Epoch 2615/3000, Training Loss: 0.05187218263745308, Test Loss: 0.060436636209487915\n", "Epoch 2616/3000, Training Loss: 0.05186011642217636, Test Loss: 0.06043720245361328\n", "Epoch 2617/3000, Training Loss: 0.05184805393218994, Test Loss: 0.06043773144483566\n", "Epoch 2618/3000, Training Loss: 0.051836028695106506, Test Loss: 0.060438353568315506\n", "Epoch 2619/3000, Training Loss: 0.05182397738099098, Test Loss: 0.06043890118598938\n", "Epoch 2620/3000, Training Loss: 0.05181194096803665, Test Loss: 0.06043951213359833\n", "Epoch 2621/3000, Training Loss: 0.05179990828037262, Test Loss: 0.060440126806497574\n", "Epoch 2622/3000, Training Loss: 0.051787879317998886, Test Loss: 0.06044073775410652\n", "Epoch 2623/3000, Training Loss: 0.05177585408091545, Test Loss: 0.06044134125113487\n", "Epoch 2624/3000, Training Loss: 0.05176384374499321, Test Loss: 0.06044190004467964\n", "Epoch 2625/3000, Training Loss: 0.051751840859651566, Test Loss: 0.06044251099228859\n", "Epoch 2626/3000, Training Loss: 0.05173981189727783, Test Loss: 0.060443174093961716\n", "Epoch 2627/3000, Training Loss: 0.05172782018780708, Test Loss: 0.06044382601976395\n", "Epoch 2628/3000, Training Loss: 0.05171580612659454, Test Loss: 0.060444463044404984\n", "Epoch 2629/3000, Training Loss: 0.0517037957906723, Test Loss: 0.06044505909085274\n", "Epoch 2630/3000, Training Loss: 0.05169183388352394, Test Loss: 0.060445625334978104\n", "Epoch 2631/3000, Training Loss: 0.05167985334992409, Test Loss: 0.060446225106716156\n", "Epoch 2632/3000, Training Loss: 0.05166788771748543, Test Loss: 0.060446880757808685\n", "Epoch 2633/3000, Training Loss: 0.051655881106853485, Test Loss: 0.06044759228825569\n", "Epoch 2634/3000, Training Loss: 0.05164392292499542, Test Loss: 0.06044824421405792\n", "Epoch 2635/3000, Training Loss: 0.05163193494081497, Test Loss: 0.060448918491601944\n", "Epoch 2636/3000, Training Loss: 0.051620010286569595, Test Loss: 0.06044955551624298\n", "Epoch 
2637/3000, Training Loss: 0.05160801485180855, Test Loss: 0.060450438410043716\n", "Epoch 2638/3000, Training Loss: 0.05159606784582138, Test Loss: 0.060451406985521317\n", "Epoch 2639/3000, Training Loss: 0.051584117114543915, Test Loss: 0.06045236065983772\n", "Epoch 2640/3000, Training Loss: 0.05157215893268585, Test Loss: 0.060453083366155624\n", "Epoch 2641/3000, Training Loss: 0.05156021937727928, Test Loss: 0.06045365706086159\n", "Epoch 2642/3000, Training Loss: 0.051548294723033905, Test Loss: 0.06045413389801979\n", "Epoch 2643/3000, Training Loss: 0.051536377519369125, Test Loss: 0.060454558581113815\n", "Epoch 2644/3000, Training Loss: 0.05152442678809166, Test Loss: 0.06045512482523918\n", "Epoch 2645/3000, Training Loss: 0.05151249095797539, Test Loss: 0.0604558102786541\n", "Epoch 2646/3000, Training Loss: 0.0515005886554718, Test Loss: 0.06045660749077797\n", "Epoch 2647/3000, Training Loss: 0.05148867517709732, Test Loss: 0.060457419604063034\n", "Epoch 2648/3000, Training Loss: 0.051476750522851944, Test Loss: 0.06045828014612198\n", "Epoch 2649/3000, Training Loss: 0.05146487057209015, Test Loss: 0.06045908108353615\n", "Epoch 2650/3000, Training Loss: 0.05145295709371567, Test Loss: 0.06045978516340256\n", "Epoch 2651/3000, Training Loss: 0.05144102871417999, Test Loss: 0.060460396111011505\n", "Epoch 2652/3000, Training Loss: 0.051429156213998795, Test Loss: 0.06046094372868538\n", "Epoch 2653/3000, Training Loss: 0.051417265087366104, Test Loss: 0.060461483895778656\n", "Epoch 2654/3000, Training Loss: 0.05140536651015282, Test Loss: 0.060462143272161484\n", "Epoch 2655/3000, Training Loss: 0.05139349400997162, Test Loss: 0.06046295166015625\n", "Epoch 2656/3000, Training Loss: 0.05138162896037102, Test Loss: 0.06046384200453758\n", "Epoch 2657/3000, Training Loss: 0.051369745284318924, Test Loss: 0.060464661568403244\n", "Epoch 2658/3000, Training Loss: 0.05135788768529892, Test Loss: 0.06046544387936592\n", "Epoch 2659/3000, Training Loss: 0.051346007734537125, Test Loss: 0.060466218739748\n", "Epoch 2660/3000, Training Loss: 0.05133417993783951, Test Loss: 0.06046688184142113\n", "Epoch 2661/3000, Training Loss: 0.05132228508591652, Test Loss: 0.06046755611896515\n", "Epoch 2662/3000, Training Loss: 0.05131044238805771, Test Loss: 0.06046823039650917\n", "Epoch 2663/3000, Training Loss: 0.051298607140779495, Test Loss: 0.06046899035573006\n", "Epoch 2664/3000, Training Loss: 0.05128674581646919, Test Loss: 0.06046973541378975\n", "Epoch 2665/3000, Training Loss: 0.05127492919564247, Test Loss: 0.060470592230558395\n", "Epoch 2666/3000, Training Loss: 0.05126309394836426, Test Loss: 0.06047135591506958\n", "Epoch 2667/3000, Training Loss: 0.05125124379992485, Test Loss: 0.06047222763299942\n", "Epoch 2668/3000, Training Loss: 0.05123945325613022, Test Loss: 0.06047297641634941\n", "Epoch 2669/3000, Training Loss: 0.05122761055827141, Test Loss: 0.06047380343079567\n", "Epoch 2670/3000, Training Loss: 0.051215797662734985, Test Loss: 0.06047453731298447\n", "Epoch 2671/3000, Training Loss: 0.05120399221777916, Test Loss: 0.06047528609633446\n", "Epoch 2672/3000, Training Loss: 0.051192160695791245, Test Loss: 0.060476161539554596\n", "Epoch 2673/3000, Training Loss: 0.05118037760257721, Test Loss: 0.06047697737812996\n", "Epoch 2674/3000, Training Loss: 0.05116858333349228, Test Loss: 0.060477789491415024\n", "Epoch 2675/3000, Training Loss: 0.051156774163246155, Test Loss: 0.0604785792529583\n", "Epoch 2676/3000, Training Loss: 0.051144976168870926, Test Loss: 
0.060479387640953064\n", "Epoch 2677/3000, Training Loss: 0.051133181899785995, Test Loss: 0.06048019975423813\n", "Epoch 2678/3000, Training Loss: 0.05112142488360405, Test Loss: 0.06048106029629707\n", "Epoch 2679/3000, Training Loss: 0.051109619438648224, Test Loss: 0.060481879860162735\n", "Epoch 2680/3000, Training Loss: 0.05109785497188568, Test Loss: 0.060482680797576904\n", "Epoch 2681/3000, Training Loss: 0.05108607932925224, Test Loss: 0.060483530163764954\n", "Epoch 2682/3000, Training Loss: 0.05107431858778, Test Loss: 0.0604843832552433\n", "Epoch 2683/3000, Training Loss: 0.051062531769275665, Test Loss: 0.06048525497317314\n", "Epoch 2684/3000, Training Loss: 0.051050782203674316, Test Loss: 0.06048601120710373\n", "Epoch 2685/3000, Training Loss: 0.05103905498981476, Test Loss: 0.060486674308776855\n", "Epoch 2686/3000, Training Loss: 0.0510273203253746, Test Loss: 0.06048741191625595\n", "Epoch 2687/3000, Training Loss: 0.05101553723216057, Test Loss: 0.06048821285367012\n", "Epoch 2688/3000, Training Loss: 0.05100378766655922, Test Loss: 0.06048915907740593\n", "Epoch 2689/3000, Training Loss: 0.05099202319979668, Test Loss: 0.06049008667469025\n", "Epoch 2690/3000, Training Loss: 0.05098029971122742, Test Loss: 0.06049104034900665\n", "Epoch 2691/3000, Training Loss: 0.050968561321496964, Test Loss: 0.060492031276226044\n", "Epoch 2692/3000, Training Loss: 0.05095687136054039, Test Loss: 0.06049296632409096\n", "Epoch 2693/3000, Training Loss: 0.050945088267326355, Test Loss: 0.06049388274550438\n", "Epoch 2694/3000, Training Loss: 0.050933413207530975, Test Loss: 0.060494694858789444\n", "Epoch 2695/3000, Training Loss: 0.050921715795993805, Test Loss: 0.06049549579620361\n", "Epoch 2696/3000, Training Loss: 0.050909969955682755, Test Loss: 0.060496240854263306\n", "Epoch 2697/3000, Training Loss: 0.050898294895887375, Test Loss: 0.06049715727567673\n", "Epoch 2698/3000, Training Loss: 0.050886571407318115, Test Loss: 0.06049807742238045\n", "Epoch 2699/3000, Training Loss: 0.050874851644039154, Test Loss: 0.060499031096696854\n", "Epoch 2700/3000, Training Loss: 0.05086321011185646, Test Loss: 0.06050020083785057\n", "Epoch 2701/3000, Training Loss: 0.050851501524448395, Test Loss: 0.06050141528248787\n", "Epoch 2702/3000, Training Loss: 0.05083980783820152, Test Loss: 0.06050245463848114\n", "Epoch 2703/3000, Training Loss: 0.050828106701374054, Test Loss: 0.06050340086221695\n", "Epoch 2704/3000, Training Loss: 0.05081644654273987, Test Loss: 0.060504309833049774\n", "Epoch 2705/3000, Training Loss: 0.0508047491312027, Test Loss: 0.06050503998994827\n", "Epoch 2706/3000, Training Loss: 0.050793107599020004, Test Loss: 0.060505785048007965\n", "Epoch 2707/3000, Training Loss: 0.05078141391277313, Test Loss: 0.06050664931535721\n", "Epoch 2708/3000, Training Loss: 0.05076974630355835, Test Loss: 0.060507502406835556\n", "Epoch 2709/3000, Training Loss: 0.05075811222195625, Test Loss: 0.06050843000411987\n", "Epoch 2710/3000, Training Loss: 0.05074644461274147, Test Loss: 0.060509324073791504\n", "Epoch 2711/3000, Training Loss: 0.050734780728816986, Test Loss: 0.06051032990217209\n", "Epoch 2712/3000, Training Loss: 0.0507231280207634, Test Loss: 0.06051136553287506\n", "Epoch 2713/3000, Training Loss: 0.05071147158741951, Test Loss: 0.06051240116357803\n", "Epoch 2714/3000, Training Loss: 0.0506998635828495, Test Loss: 0.06051349639892578\n", "Epoch 2715/3000, Training Loss: 0.0506882518529892, Test Loss: 0.060514431446790695\n", "Epoch 2716/3000, Training Loss: 
0.05067659541964531, Test Loss: 0.06051541492342949\n", "Epoch 2717/3000, Training Loss: 0.05066493898630142, Test Loss: 0.06051648408174515\n", "Epoch 2718/3000, Training Loss: 0.05065328627824783, Test Loss: 0.060517583042383194\n", "Epoch 2719/3000, Training Loss: 0.05064168572425842, Test Loss: 0.06051865965127945\n", "Epoch 2720/3000, Training Loss: 0.050630077719688416, Test Loss: 0.0605197511613369\n", "Epoch 2721/3000, Training Loss: 0.05061846971511841, Test Loss: 0.06052074581384659\n", "Epoch 2722/3000, Training Loss: 0.05060684680938721, Test Loss: 0.06052152067422867\n", "Epoch 2723/3000, Training Loss: 0.0505952313542366, Test Loss: 0.06052221357822418\n", "Epoch 2724/3000, Training Loss: 0.050583623349666595, Test Loss: 0.06052311509847641\n", "Epoch 2725/3000, Training Loss: 0.05057204142212868, Test Loss: 0.06052407622337341\n", "Epoch 2726/3000, Training Loss: 0.050560448318719864, Test Loss: 0.060525182634592056\n", "Epoch 2727/3000, Training Loss: 0.05054887756705284, Test Loss: 0.06052633747458458\n", "Epoch 2728/3000, Training Loss: 0.050537291914224625, Test Loss: 0.06052759662270546\n", "Epoch 2729/3000, Training Loss: 0.05052568018436432, Test Loss: 0.06052872911095619\n", "Epoch 2730/3000, Training Loss: 0.05051412060856819, Test Loss: 0.0605299174785614\n", "Epoch 2731/3000, Training Loss: 0.05050254240632057, Test Loss: 0.060531023889780045\n", "Epoch 2732/3000, Training Loss: 0.05049094930291176, Test Loss: 0.060532040894031525\n", "Epoch 2733/3000, Training Loss: 0.050479378551244736, Test Loss: 0.06053289398550987\n", "Epoch 2734/3000, Training Loss: 0.050467848777770996, Test Loss: 0.060533732175827026\n", "Epoch 2735/3000, Training Loss: 0.050456251949071884, Test Loss: 0.06053461879491806\n", "Epoch 2736/3000, Training Loss: 0.050444722175598145, Test Loss: 0.06053552031517029\n", "Epoch 2737/3000, Training Loss: 0.050433140248060226, Test Loss: 0.06053677201271057\n", "Epoch 2738/3000, Training Loss: 0.050421614199876785, Test Loss: 0.06053818389773369\n", "Epoch 2739/3000, Training Loss: 0.05041003227233887, Test Loss: 0.060539472848176956\n", "Epoch 2740/3000, Training Loss: 0.05039849132299423, Test Loss: 0.06054067611694336\n", "Epoch 2741/3000, Training Loss: 0.05038697272539139, Test Loss: 0.060541603714227676\n", "Epoch 2742/3000, Training Loss: 0.05037543177604675, Test Loss: 0.06054241210222244\n", "Epoch 2743/3000, Training Loss: 0.050363920629024506, Test Loss: 0.060543276369571686\n", "Epoch 2744/3000, Training Loss: 0.05035238340497017, Test Loss: 0.06054433807730675\n", "Epoch 2745/3000, Training Loss: 0.05034084618091583, Test Loss: 0.060545653104782104\n", "Epoch 2746/3000, Training Loss: 0.050329312682151794, Test Loss: 0.06054697558283806\n", "Epoch 2747/3000, Training Loss: 0.05031781643629074, Test Loss: 0.06054819002747536\n", "Epoch 2748/3000, Training Loss: 0.050306301563978195, Test Loss: 0.06054920703172684\n", "Epoch 2749/3000, Training Loss: 0.05029478296637535, Test Loss: 0.06055016070604324\n", "Epoch 2750/3000, Training Loss: 0.05028324946761131, Test Loss: 0.060551248490810394\n", "Epoch 2751/3000, Training Loss: 0.050271760672330856, Test Loss: 0.06055246293544769\n", "Epoch 2752/3000, Training Loss: 0.0502602718770504, Test Loss: 0.06055368483066559\n", "Epoch 2753/3000, Training Loss: 0.050248757004737854, Test Loss: 0.06055476516485214\n", "Epoch 2754/3000, Training Loss: 0.05023731663823128, Test Loss: 0.060555871576070786\n", "Epoch 2755/3000, Training Loss: 0.050225816667079926, Test Loss: 0.060556940734386444\n", 
"Epoch 2756/3000, Training Loss: 0.05021430924534798, Test Loss: 0.06055795028805733\n", "Epoch 2757/3000, Training Loss: 0.050202805548906326, Test Loss: 0.06055908277630806\n", "Epoch 2758/3000, Training Loss: 0.050191327929496765, Test Loss: 0.0605604350566864\n", "Epoch 2759/3000, Training Loss: 0.05017990246415138, Test Loss: 0.06056186556816101\n", "Epoch 2760/3000, Training Loss: 0.05016838014125824, Test Loss: 0.060563188046216965\n", "Epoch 2761/3000, Training Loss: 0.05015692859888077, Test Loss: 0.06056422367691994\n", "Epoch 2762/3000, Training Loss: 0.050145477056503296, Test Loss: 0.06056512892246246\n", "Epoch 2763/3000, Training Loss: 0.05013404041528702, Test Loss: 0.060566022992134094\n", "Epoch 2764/3000, Training Loss: 0.050122566521167755, Test Loss: 0.060567066073417664\n", "Epoch 2765/3000, Training Loss: 0.05011112242937088, Test Loss: 0.0605684332549572\n", "Epoch 2766/3000, Training Loss: 0.05009961873292923, Test Loss: 0.06056985259056091\n", "Epoch 2767/3000, Training Loss: 0.05008818954229355, Test Loss: 0.060571227222681046\n", "Epoch 2768/3000, Training Loss: 0.05007675662636757, Test Loss: 0.06057250499725342\n", "Epoch 2769/3000, Training Loss: 0.0500653050839901, Test Loss: 0.06057378277182579\n", "Epoch 2770/3000, Training Loss: 0.050053901970386505, Test Loss: 0.06057500094175339\n", "Epoch 2771/3000, Training Loss: 0.050042446702718735, Test Loss: 0.06057604402303696\n", "Epoch 2772/3000, Training Loss: 0.05003102868795395, Test Loss: 0.06057700514793396\n", "Epoch 2773/3000, Training Loss: 0.05001959949731827, Test Loss: 0.06057809293270111\n", "Epoch 2774/3000, Training Loss: 0.050008177757263184, Test Loss: 0.06057919189333916\n", "Epoch 2775/3000, Training Loss: 0.0499967560172081, Test Loss: 0.060580503195524216\n", "Epoch 2776/3000, Training Loss: 0.04998533055186272, Test Loss: 0.06058187410235405\n", "Epoch 2777/3000, Training Loss: 0.04997393488883972, Test Loss: 0.06058325991034508\n", "Epoch 2778/3000, Training Loss: 0.04996255040168762, Test Loss: 0.06058458238840103\n", "Epoch 2779/3000, Training Loss: 0.049951110035181046, Test Loss: 0.06058603152632713\n", "Epoch 2780/3000, Training Loss: 0.04993971064686775, Test Loss: 0.06058720871806145\n", "Epoch 2781/3000, Training Loss: 0.04992830380797386, Test Loss: 0.06058846041560173\n", "Epoch 2782/3000, Training Loss: 0.04991690814495087, Test Loss: 0.06058967113494873\n", "Epoch 2783/3000, Training Loss: 0.04990554228425026, Test Loss: 0.060590654611587524\n", "Epoch 2784/3000, Training Loss: 0.04989413544535637, Test Loss: 0.060591742396354675\n", "Epoch 2785/3000, Training Loss: 0.04988274350762367, Test Loss: 0.06059291958808899\n", "Epoch 2786/3000, Training Loss: 0.04987139254808426, Test Loss: 0.06059430539608002\n", "Epoch 2787/3000, Training Loss: 0.04986000061035156, Test Loss: 0.060595810413360596\n", "Epoch 2788/3000, Training Loss: 0.04984864220023155, Test Loss: 0.06059728562831879\n", "Epoch 2789/3000, Training Loss: 0.04983724653720856, Test Loss: 0.06059873104095459\n", "Epoch 2790/3000, Training Loss: 0.049825895577669144, Test Loss: 0.06060004606842995\n", "Epoch 2791/3000, Training Loss: 0.049814529716968536, Test Loss: 0.06060110032558441\n", "Epoch 2792/3000, Training Loss: 0.04980313777923584, Test Loss: 0.060602132230997086\n", "Epoch 2793/3000, Training Loss: 0.04979182779788971, Test Loss: 0.06060342863202095\n", "Epoch 2794/3000, Training Loss: 0.04978044703602791, Test Loss: 0.06060473248362541\n", "Epoch 2795/3000, Training Loss: 0.04976911470293999, Test Loss: 
0.06060612201690674\n", "Epoch 2796/3000, Training Loss: 0.04975775256752968, Test Loss: 0.06060756742954254\n", "Epoch 2797/3000, Training Loss: 0.04974641278386116, Test Loss: 0.060608938336372375\n", "Epoch 2798/3000, Training Loss: 0.04973507300019264, Test Loss: 0.06061028689146042\n", "Epoch 2799/3000, Training Loss: 0.04972374066710472, Test Loss: 0.06061159819364548\n", "Epoch 2800/3000, Training Loss: 0.049712419509887695, Test Loss: 0.060612957924604416\n", "Epoch 2801/3000, Training Loss: 0.049701083451509476, Test Loss: 0.06061416119337082\n", "Epoch 2802/3000, Training Loss: 0.049689747393131256, Test Loss: 0.06061536818742752\n", "Epoch 2803/3000, Training Loss: 0.04967844486236572, Test Loss: 0.06061667203903198\n", "Epoch 2804/3000, Training Loss: 0.049667127430438995, Test Loss: 0.0606180764734745\n", "Epoch 2805/3000, Training Loss: 0.04965582117438316, Test Loss: 0.060619499534368515\n", "Epoch 2806/3000, Training Loss: 0.049644481390714645, Test Loss: 0.06062093749642372\n", "Epoch 2807/3000, Training Loss: 0.04963314160704613, Test Loss: 0.0606224425137043\n", "Epoch 2808/3000, Training Loss: 0.04962184280157089, Test Loss: 0.06062391772866249\n", "Epoch 2809/3000, Training Loss: 0.04961057007312775, Test Loss: 0.0606251135468483\n", "Epoch 2810/3000, Training Loss: 0.04959924519062042, Test Loss: 0.06062637269496918\n", "Epoch 2811/3000, Training Loss: 0.04958793520927429, Test Loss: 0.060627639293670654\n", "Epoch 2812/3000, Training Loss: 0.049576666206121445, Test Loss: 0.06062890961766243\n", "Epoch 2813/3000, Training Loss: 0.049565356224775314, Test Loss: 0.06063038483262062\n", "Epoch 2814/3000, Training Loss: 0.04955407232046127, Test Loss: 0.06063203886151314\n", "Epoch 2815/3000, Training Loss: 0.049542803317308426, Test Loss: 0.06063355132937431\n", "Epoch 2816/3000, Training Loss: 0.04953153803944588, Test Loss: 0.06063486263155937\n", "Epoch 2817/3000, Training Loss: 0.04952022433280945, Test Loss: 0.060636185109615326\n", "Epoch 2818/3000, Training Loss: 0.049508970230817795, Test Loss: 0.0606374554336071\n", "Epoch 2819/3000, Training Loss: 0.04949767142534256, Test Loss: 0.06063874438405037\n", "Epoch 2820/3000, Training Loss: 0.0494864247739315, Test Loss: 0.060640137642621994\n", "Epoch 2821/3000, Training Loss: 0.04947517439723015, Test Loss: 0.060641657561063766\n", "Epoch 2822/3000, Training Loss: 0.0494639053940773, Test Loss: 0.06064317002892494\n", "Epoch 2823/3000, Training Loss: 0.04945266246795654, Test Loss: 0.060644716024398804\n", "Epoch 2824/3000, Training Loss: 0.0494413822889328, Test Loss: 0.06064596772193909\n", "Epoch 2825/3000, Training Loss: 0.049430131912231445, Test Loss: 0.060646895319223404\n", "Epoch 2826/3000, Training Loss: 0.049418870359659195, Test Loss: 0.06064802035689354\n", "Epoch 2827/3000, Training Loss: 0.04940762743353844, Test Loss: 0.06064923480153084\n", "Epoch 2828/3000, Training Loss: 0.049396414309740067, Test Loss: 0.06065075471997261\n", "Epoch 2829/3000, Training Loss: 0.049385152757167816, Test Loss: 0.06065245345234871\n", "Epoch 2830/3000, Training Loss: 0.04937391355633736, Test Loss: 0.06065420061349869\n", "Epoch 2831/3000, Training Loss: 0.04936272278428078, Test Loss: 0.060655802488327026\n", "Epoch 2832/3000, Training Loss: 0.04935145378112793, Test Loss: 0.06065734848380089\n", "Epoch 2833/3000, Training Loss: 0.04934023320674896, Test Loss: 0.06065864488482475\n", "Epoch 2834/3000, Training Loss: 0.049329034984111786, Test Loss: 0.06065978854894638\n", "Epoch 2835/3000, Training Loss: 
0.04931780323386192, Test Loss: 0.060661133378744125\n", "Epoch 2836/3000, Training Loss: 0.04930657520890236, Test Loss: 0.06066250428557396\n", "Epoch 2837/3000, Training Loss: 0.04929536208510399, Test Loss: 0.06066402420401573\n", "Epoch 2838/3000, Training Loss: 0.04928414523601532, Test Loss: 0.060665663331747055\n", "Epoch 2839/3000, Training Loss: 0.04927294701337814, Test Loss: 0.060667309910058975\n", "Epoch 2840/3000, Training Loss: 0.04926173388957977, Test Loss: 0.060668881982564926\n", "Epoch 2841/3000, Training Loss: 0.049250535666942596, Test Loss: 0.06067036837339401\n", "Epoch 2842/3000, Training Loss: 0.04923933744430542, Test Loss: 0.060671769082546234\n", "Epoch 2843/3000, Training Loss: 0.04922814294695854, Test Loss: 0.060673102736473083\n", "Epoch 2844/3000, Training Loss: 0.04921698197722435, Test Loss: 0.06067446246743202\n", "Epoch 2845/3000, Training Loss: 0.049205806106328964, Test Loss: 0.06067595258355141\n", "Epoch 2846/3000, Training Loss: 0.04919460043311119, Test Loss: 0.06067754700779915\n", "Epoch 2847/3000, Training Loss: 0.04918339475989342, Test Loss: 0.060680415481328964\n", "Epoch 2848/3000, Training Loss: 0.04917221888899803, Test Loss: 0.06068393215537071\n", "Epoch 2849/3000, Training Loss: 0.04916105791926384, Test Loss: 0.060687072575092316\n", "Epoch 2850/3000, Training Loss: 0.049149904400110245, Test Loss: 0.0606890469789505\n", "Epoch 2851/3000, Training Loss: 0.04913870990276337, Test Loss: 0.060689736157655716\n", "Epoch 2852/3000, Training Loss: 0.049127571284770966, Test Loss: 0.06068964675068855\n", "Epoch 2853/3000, Training Loss: 0.049116410315036774, Test Loss: 0.0606895387172699\n", "Epoch 2854/3000, Training Loss: 0.0491052083671093, Test Loss: 0.06069018691778183\n", "Epoch 2855/3000, Training Loss: 0.04909409210085869, Test Loss: 0.06069178134202957\n", "Epoch 2856/3000, Training Loss: 0.04908293858170509, Test Loss: 0.060694269835948944\n", "Epoch 2857/3000, Training Loss: 0.0490717738866806, Test Loss: 0.06069716066122055\n", "Epoch 2858/3000, Training Loss: 0.04906062036752701, Test Loss: 0.06069965288043022\n", "Epoch 2859/3000, Training Loss: 0.049049511551856995, Test Loss: 0.06070147082209587\n", "Epoch 2860/3000, Training Loss: 0.049038372933864594, Test Loss: 0.0607023686170578\n", "Epoch 2861/3000, Training Loss: 0.04902719706296921, Test Loss: 0.06070265918970108\n", "Epoch 2862/3000, Training Loss: 0.04901610314846039, Test Loss: 0.06070314347743988\n", "Epoch 2863/3000, Training Loss: 0.04900494962930679, Test Loss: 0.06070408970117569\n", "Epoch 2864/3000, Training Loss: 0.04899384081363678, Test Loss: 0.060705751180648804\n", "Epoch 2865/3000, Training Loss: 0.04898269101977348, Test Loss: 0.060707975178956985\n", "Epoch 2866/3000, Training Loss: 0.048971597105264664, Test Loss: 0.06071034073829651\n", "Epoch 2867/3000, Training Loss: 0.048960477113723755, Test Loss: 0.06071233004331589\n", "Epoch 2868/3000, Training Loss: 0.04894934594631195, Test Loss: 0.060713786631822586\n", "Epoch 2869/3000, Training Loss: 0.04893823713064194, Test Loss: 0.0607147216796875\n", "Epoch 2870/3000, Training Loss: 0.048927128314971924, Test Loss: 0.0607154406607151\n", "Epoch 2871/3000, Training Loss: 0.04891600459814072, Test Loss: 0.060716331005096436\n", "Epoch 2872/3000, Training Loss: 0.04890492558479309, Test Loss: 0.0607178770005703\n", "Epoch 2873/3000, Training Loss: 0.04889380931854248, Test Loss: 0.06071992218494415\n", "Epoch 2874/3000, Training Loss: 0.048882704228162766, Test Loss: 0.06072219833731651\n", "Epoch 
2875/3000, Training Loss: 0.04887163266539574, Test Loss: 0.06072431430220604\n", "Epoch 2876/3000, Training Loss: 0.04886053875088692, Test Loss: 0.0607261098921299\n", "Epoch 2877/3000, Training Loss: 0.0488494336605072, Test Loss: 0.06072746217250824\n", "Epoch 2878/3000, Training Loss: 0.04883836582303047, Test Loss: 0.06072842329740524\n", "Epoch 2879/3000, Training Loss: 0.04882727190852165, Test Loss: 0.06072945147752762\n", "Epoch 2880/3000, Training Loss: 0.04881621524691582, Test Loss: 0.060730718076229095\n", "Epoch 2881/3000, Training Loss: 0.0488051176071167, Test Loss: 0.06073242053389549\n", "Epoch 2882/3000, Training Loss: 0.04879402369260788, Test Loss: 0.060734450817108154\n", "Epoch 2883/3000, Training Loss: 0.048782967031002045, Test Loss: 0.060736555606126785\n", "Epoch 2884/3000, Training Loss: 0.04877191781997681, Test Loss: 0.06073840335011482\n", "Epoch 2885/3000, Training Loss: 0.04876082390546799, Test Loss: 0.06073983013629913\n", "Epoch 2886/3000, Training Loss: 0.04874980449676514, Test Loss: 0.06074102222919464\n", "Epoch 2887/3000, Training Loss: 0.048738736659288406, Test Loss: 0.0607423335313797\n", "Epoch 2888/3000, Training Loss: 0.04872767627239227, Test Loss: 0.06074392795562744\n", "Epoch 2889/3000, Training Loss: 0.04871658608317375, Test Loss: 0.06074574962258339\n", "Epoch 2890/3000, Training Loss: 0.0487055666744709, Test Loss: 0.06074776500463486\n", "Epoch 2891/3000, Training Loss: 0.048694513738155365, Test Loss: 0.06074976548552513\n", "Epoch 2892/3000, Training Loss: 0.04868347942829132, Test Loss: 0.06075160950422287\n", "Epoch 2893/3000, Training Loss: 0.048672422766685486, Test Loss: 0.06075320392847061\n", "Epoch 2894/3000, Training Loss: 0.04866139963269234, Test Loss: 0.06075465679168701\n", "Epoch 2895/3000, Training Loss: 0.048650357872247696, Test Loss: 0.06075606122612953\n", "Epoch 2896/3000, Training Loss: 0.04863932728767395, Test Loss: 0.06075747311115265\n", "Epoch 2897/3000, Training Loss: 0.048628341406583786, Test Loss: 0.060759156942367554\n", "Epoch 2898/3000, Training Loss: 0.04861726611852646, Test Loss: 0.06076102703809738\n", "Epoch 2899/3000, Training Loss: 0.04860625043511391, Test Loss: 0.060762982815504074\n", "Epoch 2900/3000, Training Loss: 0.04859523847699165, Test Loss: 0.060764797031879425\n", "Epoch 2901/3000, Training Loss: 0.048584192991256714, Test Loss: 0.06076643615961075\n", "Epoch 2902/3000, Training Loss: 0.04857318103313446, Test Loss: 0.06076810136437416\n", "Epoch 2903/3000, Training Loss: 0.04856216907501221, Test Loss: 0.060769811272621155\n", "Epoch 2904/3000, Training Loss: 0.048551179468631744, Test Loss: 0.060771599411964417\n", "Epoch 2905/3000, Training Loss: 0.04854017496109009, Test Loss: 0.06077328324317932\n", "Epoch 2906/3000, Training Loss: 0.048529163002967834, Test Loss: 0.06077517941594124\n", "Epoch 2907/3000, Training Loss: 0.04851815477013588, Test Loss: 0.06077706813812256\n", "Epoch 2908/3000, Training Loss: 0.04850716143846512, Test Loss: 0.060778889805078506\n", "Epoch 2909/3000, Training Loss: 0.04849620163440704, Test Loss: 0.06078064441680908\n", "Epoch 2910/3000, Training Loss: 0.0484851635992527, Test Loss: 0.060782305896282196\n", "Epoch 2911/3000, Training Loss: 0.04847420006990433, Test Loss: 0.06078396365046501\n", "Epoch 2912/3000, Training Loss: 0.048463210463523865, Test Loss: 0.06078571453690529\n", "Epoch 2913/3000, Training Loss: 0.04845220223069191, Test Loss: 0.06078740581870079\n", "Epoch 2914/3000, Training Loss: 0.048441238701343536, Test Loss: 
0.06078914552927017\n", "Epoch 2915/3000, Training Loss: 0.04843027517199516, Test Loss: 0.06079090014100075\n", "Epoch 2916/3000, Training Loss: 0.048419252038002014, Test Loss: 0.060792822390794754\n", "Epoch 2917/3000, Training Loss: 0.04840833321213722, Test Loss: 0.06079479306936264\n", "Epoch 2918/3000, Training Loss: 0.048397358506917953, Test Loss: 0.060796789824962616\n", "Epoch 2919/3000, Training Loss: 0.04838637635111809, Test Loss: 0.06079871952533722\n", "Epoch 2920/3000, Training Loss: 0.048375412821769714, Test Loss: 0.060800496488809586\n", "Epoch 2921/3000, Training Loss: 0.048364464193582535, Test Loss: 0.0608021542429924\n", "Epoch 2922/3000, Training Loss: 0.048353467136621475, Test Loss: 0.06080377474427223\n", "Epoch 2923/3000, Training Loss: 0.048342544585466385, Test Loss: 0.06080549210309982\n", "Epoch 2924/3000, Training Loss: 0.048331595957279205, Test Loss: 0.060807183384895325\n", "Epoch 2925/3000, Training Loss: 0.048320673406124115, Test Loss: 0.060809046030044556\n", "Epoch 2926/3000, Training Loss: 0.04830971732735634, Test Loss: 0.0608111247420311\n", "Epoch 2927/3000, Training Loss: 0.048298776149749756, Test Loss: 0.06081331893801689\n", "Epoch 2928/3000, Training Loss: 0.048287827521562576, Test Loss: 0.06081536039710045\n", "Epoch 2929/3000, Training Loss: 0.0482768639922142, Test Loss: 0.0608171783387661\n", "Epoch 2930/3000, Training Loss: 0.04826592653989792, Test Loss: 0.060818932950496674\n", "Epoch 2931/3000, Training Loss: 0.04825499281287193, Test Loss: 0.060820452868938446\n", "Epoch 2932/3000, Training Loss: 0.048244062811136246, Test Loss: 0.060822129249572754\n", "Epoch 2933/3000, Training Loss: 0.04823315143585205, Test Loss: 0.06082403287291527\n", "Epoch 2934/3000, Training Loss: 0.048222217708826065, Test Loss: 0.06082600727677345\n", "Epoch 2935/3000, Training Loss: 0.04821131378412247, Test Loss: 0.06082805618643761\n", "Epoch 2936/3000, Training Loss: 0.048200394958257675, Test Loss: 0.06083008274435997\n", "Epoch 2937/3000, Training Loss: 0.04818946123123169, Test Loss: 0.06083211302757263\n", "Epoch 2938/3000, Training Loss: 0.04817856103181839, Test Loss: 0.06083403900265694\n", "Epoch 2939/3000, Training Loss: 0.048167675733566284, Test Loss: 0.06083587557077408\n", "Epoch 2940/3000, Training Loss: 0.04815673828125, Test Loss: 0.06083765625953674\n", "Epoch 2941/3000, Training Loss: 0.048145804554224014, Test Loss: 0.06083925813436508\n", "Epoch 2942/3000, Training Loss: 0.048134904354810715, Test Loss: 0.0608411468565464\n", "Epoch 2943/3000, Training Loss: 0.048124030232429504, Test Loss: 0.060843244194984436\n", "Epoch 2944/3000, Training Loss: 0.048113107681274414, Test Loss: 0.060845475643873215\n", "Epoch 2945/3000, Training Loss: 0.0481022484600544, Test Loss: 0.060847487300634384\n", "Epoch 2946/3000, Training Loss: 0.048091329634189606, Test Loss: 0.06084931641817093\n", "Epoch 2947/3000, Training Loss: 0.04808047041296959, Test Loss: 0.06085117906332016\n", "Epoch 2948/3000, Training Loss: 0.04806956648826599, Test Loss: 0.06085303798317909\n", "Epoch 2949/3000, Training Loss: 0.04805868864059448, Test Loss: 0.060854919254779816\n", "Epoch 2950/3000, Training Loss: 0.04804776608943939, Test Loss: 0.06085686385631561\n", "Epoch 2951/3000, Training Loss: 0.04803689941763878, Test Loss: 0.060858771204948425\n", "Epoch 2952/3000, Training Loss: 0.04802604019641876, Test Loss: 0.06086084991693497\n", "Epoch 2953/3000, Training Loss: 0.04801515117287636, Test Loss: 0.06086302921175957\n", "Epoch 2954/3000, Training Loss: 
0.048004306852817535, Test Loss: 0.06086507812142372\n", "Epoch 2955/3000, Training Loss: 0.04799341782927513, Test Loss: 0.06086699292063713\n", "Epoch 2956/3000, Training Loss: 0.047982558608055115, Test Loss: 0.06086879223585129\n", "Epoch 2957/3000, Training Loss: 0.047971706837415695, Test Loss: 0.06087062135338783\n", "Epoch 2958/3000, Training Loss: 0.0479608029127121, Test Loss: 0.06087258830666542\n", "Epoch 2959/3000, Training Loss: 0.047949984669685364, Test Loss: 0.06087471544742584\n", "Epoch 2960/3000, Training Loss: 0.04793912172317505, Test Loss: 0.06087697297334671\n", "Epoch 2961/3000, Training Loss: 0.04792824760079384, Test Loss: 0.060879141092300415\n", "Epoch 2962/3000, Training Loss: 0.04791739955544472, Test Loss: 0.06088107451796532\n", "Epoch 2963/3000, Training Loss: 0.04790657013654709, Test Loss: 0.06088286265730858\n", "Epoch 2964/3000, Training Loss: 0.04789573699235916, Test Loss: 0.060884565114974976\n", "Epoch 2965/3000, Training Loss: 0.04788488522171974, Test Loss: 0.06088651716709137\n", "Epoch 2966/3000, Training Loss: 0.04787406325340271, Test Loss: 0.060888756066560745\n", "Epoch 2967/3000, Training Loss: 0.04786321148276329, Test Loss: 0.06089109554886818\n", "Epoch 2968/3000, Training Loss: 0.047852352261543274, Test Loss: 0.06089335307478905\n", "Epoch 2969/3000, Training Loss: 0.04784156009554863, Test Loss: 0.06089526042342186\n", "Epoch 2970/3000, Training Loss: 0.047830723226070404, Test Loss: 0.060897096991539\n", "Epoch 2971/3000, Training Loss: 0.04781989008188248, Test Loss: 0.060898952186107635\n", "Epoch 2972/3000, Training Loss: 0.047809064388275146, Test Loss: 0.06090099364519119\n", "Epoch 2973/3000, Training Loss: 0.047798242419958115, Test Loss: 0.06090322509407997\n", "Epoch 2974/3000, Training Loss: 0.04778743162751198, Test Loss: 0.06090543791651726\n", "Epoch 2975/3000, Training Loss: 0.04777663201093674, Test Loss: 0.060907527804374695\n", "Epoch 2976/3000, Training Loss: 0.04776579141616821, Test Loss: 0.060909442603588104\n", "Epoch 2977/3000, Training Loss: 0.04775499179959297, Test Loss: 0.06091148406267166\n", "Epoch 2978/3000, Training Loss: 0.04774417728185654, Test Loss: 0.060913652181625366\n", "Epoch 2979/3000, Training Loss: 0.0477333702147007, Test Loss: 0.06091571971774101\n", "Epoch 2980/3000, Training Loss: 0.047722574323415756, Test Loss: 0.060917798429727554\n", "Epoch 2981/3000, Training Loss: 0.047711774706840515, Test Loss: 0.060920022428035736\n", "Epoch 2982/3000, Training Loss: 0.04770098254084587, Test Loss: 0.06092216446995735\n", "Epoch 2983/3000, Training Loss: 0.047690197825431824, Test Loss: 0.06092426925897598\n", "Epoch 2984/3000, Training Loss: 0.04767940193414688, Test Loss: 0.06092628464102745\n", "Epoch 2985/3000, Training Loss: 0.047668613493442535, Test Loss: 0.060928262770175934\n", "Epoch 2986/3000, Training Loss: 0.0476578064262867, Test Loss: 0.06093037873506546\n", "Epoch 2987/3000, Training Loss: 0.04764704033732414, Test Loss: 0.06093268096446991\n", "Epoch 2988/3000, Training Loss: 0.0476362518966198, Test Loss: 0.06093496456742287\n", "Epoch 2989/3000, Training Loss: 0.04762548580765724, Test Loss: 0.060937099158763885\n", "Epoch 2990/3000, Training Loss: 0.047614701092243195, Test Loss: 0.06093911454081535\n", "Epoch 2991/3000, Training Loss: 0.04760391637682915, Test Loss: 0.06094112992286682\n", "Epoch 2992/3000, Training Loss: 0.04759315028786659, Test Loss: 0.06094307079911232\n", "Epoch 2993/3000, Training Loss: 0.047582391649484634, Test Loss: 0.0609453059732914\n", "Epoch 
2994/3000, Training Loss: 0.047571610659360886, Test Loss: 0.060947731137275696\n", "Epoch 2995/3000, Training Loss: 0.047560855746269226, Test Loss: 0.06095010042190552\n", "Epoch 2996/3000, Training Loss: 0.047550078481435776, Test Loss: 0.06095227599143982\n", "Epoch 2997/3000, Training Loss: 0.04753931611776352, Test Loss: 0.06095428392291069\n", "Epoch 2998/3000, Training Loss: 0.04752859100699425, Test Loss: 0.06095632538199425\n", "Epoch 2999/3000, Training Loss: 0.04751782864332199, Test Loss: 0.06095853075385094\n", "Epoch 3000/3000, Training Loss: 0.04750705510377884, Test Loss: 0.06096077710390091\n" ] }, { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAjcAAAHHCAYAAABDUnkqAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjguMCwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy81sbWrAAAACXBIWXMAAA9hAAAPYQGoP6dpAABxQElEQVR4nO3dd3gU5d7G8e/uJtn0RkIKBELvXYGAiB6jAZUjiorKEbBxRNSjiAULYMUCliOKHdSjYnkRUUEpggVRuoKEXkJJAgTSQ8ruvH9ssrIkhBCSbMr9ua65dnbmmdnfzm7Ym5lnZkyGYRiIiIiI1BNmdxcgIiIiUpUUbkRERKReUbgRERGRekXhRkREROoVhRsRERGpVxRuREREpF5RuBEREZF6ReFGRERE6hWFGxEREalXFG6kzhs9ejSxsbGVWnbKlCmYTKaqLaiW2bNnDyaTidmzZ7u7lNOaPXs2JpOJPXv2uLsUaUBKvndr1qxxdylSRRRupNqYTKYKDcuXL3d3qQ1ebGxshT6rqgpIzzzzDPPmzauSdVWVkqB75MgRd5dSId988w2DBg2iUaNGeHt707ZtWyZMmEBaWpq7SyulJDycavjtt9/cXaLUMx7uLkDqrw8//NDl+QcffMDixYtLTe/QocNZvc7bb7+N3W6v1LKPPvooDz300Fm9fn3w8ssvk52d7Xy+YMECPvnkE1566SXCwsKc0/v161clr/fMM89w9dVXM3ToUJfpN954I9dddx1Wq7VKXqe+mjBhAtOnT6dbt248+OCDhIaGsm7dOmbMmMGcOXNYunQp7dq1c3eZpTzxxBO0aNGi1PTWrVu7oRqpzxRupNr861//cnn+22+/sXjx4lLTT5abm4uvr2+FX8fT07NS9QF4eHjg4aE/g5NDRkpKCp988glDhw6t9CG/yrBYLFgslhp7vbrok08+Yfr06QwfPpyPPvrIZXuNHj2aCy+8kGuuuYZ169bV6Hc7JycHPz+/ctsMHjyYc845p4YqkoZMh6XErS644AI6d+7M2rVrOf/88/H19eXhhx8G4KuvvuKyyy4jOjoaq9VKq1atePLJJ7HZbC7rOLnPTUkfk2nTpvHWW2/RqlUrrFYr5557LqtXr3ZZtqw+NyaTiTvvvJN58+bRuXNnrFYrnTp14rvvvitV//LlyznnnHPw9vamVatWvPnmmxXux/Pzzz9zzTXX0KxZM6xWKzExMdx7773k5eWVen/+/v4cOHCAoUOH4u/vT3h4OBMmTCi1LdLT0xk9ejRBQUEEBwczatQo0tPTT1tLRf3vf/+jV69e+Pj4EBoaynXXXce+fftc2mzfvp1hw4YRGRmJt7c3TZs25brrriMjIwNwbN+cnBzef/9952GJ0aNHA2X3uYmNjeXyyy/nl19+oXfv3nh7e9OyZUs++OCDUvX9+eefDBw4EB8fH5o2bcpTTz3FrFmzqrQfzw8//MCAAQPw8/MjODiYK664gsTERJc2WVlZ3HPPPcTGxmK1WmncuDEXX3wx69atq/B2OpXHH3+ckJAQ3nrrrVJBsHfv3jz44INs3LiRL774AoA777wTf39/cnNzS63r+uuvJzIy0uV7tHDhQuf7CwgI4LLLLuOvv/5yWa7kO7lz504uvfRSAgICGDFiRMU2YDlO/Nt96aWXaN68OT4+PgwcOJBNmzaVal+RzwLgwIED3HLLLc5/S1q0aMHYsWMpKChwaZefn8/48eMJDw/Hz8+PK6+8ksOHD7u0WbNmDQkJCYSFheHj40OLFi24+eabz/q9S9XSf1nF7dLS0hg8eDDXXXcd//rXv4iIiAAcP3T+/v6MHz8ef39/fvjhByZNmkRmZiYvvPDCadf78ccfk5WVxb///W9MJhPPP/88V111Fbt27Trt3p5ffvmFuXPncscddxAQEMB///tfhg0bRlJSEo0aNQJg/fr1DBo0iKioKB5//HFsNhtPPPEE4eHhFXrfn3/+Obm5uYwdO5ZGjRqxatUqXn31Vfbv38/nn3/u0tZms5GQkECfPn2YNm0aS5YsYfr06bRq1YqxY8cCYBgGV1xxBb/88gu33347HTp04Msvv2TUqFEVqud0nn76aR577DGuvfZabr31Vg4fPsyrr77K+eefz/r16wkODqagoICEhATy8/O56667iIyM5MCBA3zzzTekp6cTFBTEhx9+yK233krv3r0ZM2YMAK1atSr3tXfs2MHVV1/NLbfcwqhRo3jvvfcYPXo0vXr1olOnToDjB+zCCy/EZDIxceJE/Pz8eOedd6r0ENeSJUsYPHgwLVu2ZMqUKeTl5fHqq6/Sv39/1q1b5wzZt99+O1988QV33nknHTt2JC0tjV9++YXExER69uxZoe1Ulu3bt7N161ZGjx5NYGBgmW1GjhzJ5MmT+eabb7juuusYPnw4r732Gt9++y3XXHONs11ubi5ff/01o0ePdoakDz/8kFGjRpGQkMBzzz1Hbm4uM2fO5LzzzmP9+vUu/4koKioiISGB8847j2nTplVob2tGRkapPk0mk8n5N1Xigw8+ICsri3HjxnH8+HFeeeUV/vGPf7Bx40bnvw8V/SwOHjxI7969SU9PZ8yYMbRv354DBw7wxRdfkJubi5eXl/N177rrLkJCQpg8eTJ79uzh5Zdf5s477+TTTz8F4NChQ1xyySWEh4fz0EMPERwczJ49e5g7d+5p37vUMEOkhowbN844+Ss3cOBAAzDeeOONUu1zc3NLTfv3v/9t+Pr6GsePH3dOGzVqlNG8eXPn8927dxuA0ahRI+Po0aPO6V999ZUBGF9//bVz2uTJk0vVBBheXl7Gjh07nNP++OMPAzBeffVV57QhQ4YYvr6+xoEDB5zTtm/fbnh4eJRaZ1nKen9Tp041TCaTsXfvXpf3BxhPP
PGES9sePXoYvXr1cj6fN2+eARjPP/+8c1pRUZExYMAAAzBmzZp12ppKvPDCCwZg7N692zAMw9izZ49hsViMp59+2qXdxo0bDQ8PD+f09evXG4Dx+eefl7t+Pz8/Y9SoUaWmz5o1y+V1DcMwmjdvbgDGTz/95Jx26NAhw2q1Gvfdd59z2l133WWYTCZj/fr1zmlpaWlGaGhoqXWWpeS7cPjw4VO26d69u9G4cWMjLS3NOe2PP/4wzGazMXLkSOe0oKAgY9y4cadcT0W308lKPuOXXnqp3HaBgYFGz549DcMwDLvdbjRp0sQYNmyYS5vPPvvMZbtmZWUZwcHBxm233ebSLiUlxQgKCnKZXvKdfOihhypUd8nnWtZgtVqd7Ur+dn18fIz9+/c7p//+++8GYNx7773OaRX9LEaOHGmYzWZj9erVpeqy2+0u9cXHxzunGYZh3HvvvYbFYjHS09MNwzCML7/80gDKXJfULjosJW5ntVq56aabSk338fFxjmdlZXHkyBEGDBhAbm4uW7ZsOe16hw8fTkhIiPP5gAEDANi1a9dpl42Pj3fZm9C1a1cCAwOdy9psNpYsWcLQoUOJjo52tmvdujWDBw8+7frB9f3l5ORw5MgR+vXrh2EYrF+/vlT722+/3eX5gAEDXN7LggUL8PDwcO7JAUcflrvuuqtC9ZRn7ty52O12rr32Wo4cOeIcIiMjadOmDcuWLQNw7nH4/vvvyzwMUlkdO3Z0fn4A4eHhtGvXzuX9f/fdd8TFxdG9e3fntNDQ0Co5XAKQnJzMhg0bGD16NKGhoc7pXbt25eKLL2bBggXOacHBwfz+++8cPHiwzHVVdjtlZWUBEBAQUG67gIAAMjMzAceekWuuuYYFCxa4dBr/9NNPadKkCeeddx4AixcvJj09neuvv97lM7ZYLPTp08f5GZ/oxO9aRbz22mssXrzYZVi4cGGpdkOHDqVJkybO571796ZPnz7ObVzRz8JutzNv3jyGDBlSZl+fkw8fjxkzxmXagAEDsNls7N27F3B8ruA4U62wsPCM3rvULIUbcbsmTZq47Bou8ddff3HllVcSFBREYGAg4eHhzs7Ip+uXANCsWTOX5yVB59ixY2e8bMnyJcseOnSIvLy8Ms/yqOiZH0lJSc5/nEv60QwcOBAo/f68vb1LHe46sR6AvXv3EhUVhb+/v0u7qjhrZvv27RiGQZs2bQgPD3cZEhMTOXToEAAtWrRg/PjxvPPOO4SFhZGQkMBrr71Woc+rPKf7PMDx/s/m8zidkh+4srZnhw4dOHLkCDk5OQA8//zzbNq0iZiYGHr37s2UKVNcglhlt1NJqCkJOaeSlZXlEoCGDx9OXl4e8+fPByA7O5sFCxZwzTXXOH/Mt2/fDsA//vGPUp/xokWLnJ9xCQ8PD5o2bVpuHSfr3bs38fHxLsOFF15Yql2bNm1KTWvbtq2z31RFP4vDhw+TmZlJ586dK1Tf6f7NGDhwIMOGDePxxx8nLCyMK664glmzZpGfn1+h9UvNUZ8bcbsT92CUSE9PZ+DAgQQGBvLEE0/QqlUrvL29WbduHQ8++GCFTv0+1Vk3hmFU67IVYbPZuPjiizl69CgPPvgg7du3x8/PjwMHDjB69OhS78/dZxDZ7XZMJhMLFy4ss5YTA9X06dMZPXo0X331FYsWLeLuu+9m6tSp/Pbbb2f8Y1iiuj+PqnbttdcyYMAAvvzySxYtWsQLL7zAc889x9y5c5179iqznUoum/Dnn3+e8rX37t1LZmYmHTt2dE7r27cvsbGxfPbZZ9xwww18/fXX5OXlMXz4cGebku/chx9+SGRkZKn1nnzmldVqxWyuX/8/Pt33zGQy8cUXX/Dbb7/x9ddf8/3333PzzTczffp0fvvtt1L/sRD3UbiRWmn58uWkpaUxd+5czj//fOf03bt3u7GqvzVu3Bhvb2927NhRal5Z0062ceNGtm3bxvvvv8/IkSOd0xcvXlzpmpo3b87SpUvJzs52+Ud269atlV5niVatWmEYBi1atKBt27anbd+lSxe6dOnCo48+yq+//kr//v154403eOqpp4DShwOqQvPmzSv9eVR0/VD29tyyZQthYWEup0JHRUVxxx13cMcdd3Do0CF69uzJ008/7XLY8nTb6WRt27albdu2zJs3j1deeaXMw1MlZ5FdfvnlLtOvvfZaXnnlFTIzM/n000+JjY2lb9++zvklh2EbN25MfHx8RTdLtSjZi3Sibdu2OTsJV/Sz8PHxITAwsMwzrc5G37596du3L08//TQff/wxI0aMYM6cOdx6661V+jpSefUrdku9UfI/qBP/Z15QUMDrr7/urpJcWCwW4uPjmTdvnku/ih07dpTZh6Cs5cH1/RmGwSuvvFLpmi699FKKioqYOXOmc5rNZuPVV1+t9DpLXHXVVVgsFh5//PFSe0sMw3BeFTczM5OioiKX+V26dMFsNrvsuvfz86vSU9QBEhISWLlyJRs2bHBOO3r0KB999FGVrD8qKoru3bvz/vvvu9S+adMmFi1axKWXXgo4tvnJh5caN25MdHS0cxtUdDuVZdKkSRw7dozbb7+91KUA1q5dy3PPPUfnzp0ZNmyYy7zhw4eTn5/P+++/z3fffce1117rMj8hIYHAwECeeeaZMvuTnHxKdHWaN28eBw4ccD5ftWoVv//+uzMYVvSzMJvNDB06lK+//rrMWyuc6Z6/Y8eOlVqmpI+XDk3VLtpzI7VSv379CAkJYdSoUdx9992YTCY+/PDDWnUYYsqUKSxatIj+/fszduxYbDYbM2bMoHPnzi4/sGVp3749rVq1YsKECRw4cIDAwED+7//+r0L9gU5lyJAh9O/fn4ceeog9e/bQsWNH5s6de9b9XcDxv/qnnnqKiRMnsmfPHoYOHUpAQAC7d+/myy+/ZMyYMUyYMIEffviBO++8k2uuuYa2bdtSVFTEhx9+iMVicfmx7dWrF0uWLOHFF18kOjqaFi1a0KdPn7Oq8YEHHuB///sfF198MXfddZfzVPBmzZpx9OjRCu8tevHFF0ud1mw2m3n44Yd54YUXGDx4MHFxcdxyyy3O04+DgoKYMmUK4Ojv0rRpU66++mq6deuGv78/S5YsYfXq1UyfPh2gwtupLCNGjGD16tW88sorbN68mREjRhASEsK6det47733aNSoEV988UWpyx307NmT1q1b88gjj5Cfn+9ySAogMDCQmTNncuONN9KzZ0+uu+46wsPDSUpK4ttvv6V///7MmDGjQtvwVBYuXFjmyQD9+vWjZcuWzuetW7fmvPPOY+zYseTn5/Pyyy/TqFEjHnjgAWebinwW4Lga9qJFixg4cCBjxoyhQ4cOJCcn8/nnn/PLL784OwlXxPvvv8/rr7/OlVdeSatWrcjKyuLtt98mMDDQGaiklnDDGVrSQJ3qVPBOnTqV2X7FihVG3759DR8fHyM6Otp44IEHjO+//94AjGXLljnbnepU8BdeeKHUOgFj8uTJzuenOhW8rNN4mzdvXur05aVLlxo9evQwvLy8jFatWhnvvPOOcd999xne
3t6n2Ap/27x5sxEfH2/4+/sbYWFhxm233eY85fzE07ZHjRpl+Pn5lVq+rNrT0tKMG2+80QgMDDSCgoKMG2+80Xna8dmcCl7i//7v/4zzzjvP8PPzM/z8/Iz27dsb48aNM7Zu3WoYhmHs2rXLuPnmm41WrVoZ3t7eRmhoqHHhhRcaS5YscVnPli1bjPPPP9/w8fExAOd2PdWp4JdddlmpGgcOHGgMHDjQZdr69euNAQMGGFar1WjatKkxdepU47///a8BGCkpKeW+55LtWdZgsVic7ZYsWWL079/f8PHxMQIDA40hQ4YYmzdvds7Pz8837r//fqNbt25GQECA4efnZ3Tr1s14/fXXnW0qup3KM2/ePOPiiy82QkJCDKvVarRu3dq47777yj2V/ZFHHjEAo3Xr1qdss2zZMiMhIcEICgoyvL29jVatWhmjR4821qxZ42xzqu/kqZR3KviJ380T/3anT59uxMTEGFar1RgwYIDxxx9/lFrv6T6LEnv37jVGjhxphIeHG1ar1WjZsqUxbtw4Iz8/36W+k0/xXrZsmcu/N+vWrTOuv/56o1mzZobVajUaN25sXH755S7bRmoHk2HUov8Ki9QDQ4cO5a+//iqz34DUvHvuuYc333yT7Oxst3fMlvLt2bOHFi1a8MILLzBhwgR3lyN1mPrciJyFk2+VsH37dhYsWMAFF1zgnoIauJM/j7S0ND788EPOO+88BRuRBkR9bkTOQsuWLRk9ejQtW7Zk7969zJw5Ey8vL5e+AVJz4uLiuOCCC+jQoQOpqam8++67ZGZm8thjj7m7NBGpQQo3Imdh0KBBfPLJJ6SkpGC1WomLi+OZZ54p8yJkUv0uvfRSvvjiC9566y1MJhM9e/bk3XffdbmcgIjUf+pzIyIiIvWK+tyIiIhIvaJwIyIiIvVKg+tzY7fbOXjwIAEBAdVyCXgRERGpeoZhkJWVRXR09Gnva9bgws3BgweJiYlxdxkiIiJSCfv27TvtTXgbXLgpudHcvn37CAwMdHM1IiIiUhGZmZnExMSUecPYkzW4cFNyKCowMFDhRkREpI6pSJcSdSgWERGRekXhRkREROoVhRsRERGpVxpcnxsREXEPu91OQUGBu8uQWszLy+u0p3lXhMKNiIhUu4KCAnbv3o3dbnd3KVKLmc1mWrRogZeX11mtR+FGRESqlWEYJCcnY7FYiImJqZL/mUv9U3KR3eTkZJo1a3ZWF9pVuBERkWpVVFREbm4u0dHR+Pr6urscqcXCw8M5ePAgRUVFeHp6Vno9is8iIlKtbDYbwFkfapD6r+Q7UvKdqSyFGxERqRG6n5+cTlV9RxRuREREpF5RuBEREakhsbGxvPzyyxVuv3z5ckwmE+np6dVWU32kcCMiInISk8lU7jBlypRKrXf16tWMGTOmwu379etHcnIyQUFBlXq9iqpvIUpnS1WVogLIOQz2Ighp7u5qRETkLCQnJzvHP/30UyZNmsTWrVud0/z9/Z3jhmFgs9nw8Dj9T2p4ePgZ1eHl5UVkZOQZLSPac1N19q+GlzrC/65ydyUiInKWIiMjnUNQUBAmk8n5fMuWLQQEBLBw4UJ69eqF1Wrll19+YefOnVxxxRVERETg7+/Pueeey5IlS1zWe/JhKZPJxDvvvMOVV16Jr68vbdq0Yf78+c75J+9RmT17NsHBwXz//fd06NABf39/Bg0a5BLGioqKuPvuuwkODqZRo0Y8+OCDjBo1iqFDh1Z6exw7doyRI0cSEhKCr68vgwcPZvv27c75e/fuZciQIYSEhODn50enTp1YsGCBc9kRI0YQHh6Oj48Pbdq0YdasWZWupSIUbqqKlx8A9vxsNxciIlK7GYZBbkGRWwbDMKrsfTz00EM8++yzJCYm0rVrV7Kzs7n00ktZunQp69evZ9CgQQwZMoSkpKRy1/P4449z7bXX8ueff3LppZcyYsQIjh49esr2ubm5TJs2jQ8//JCffvqJpKQkJkyY4Jz/3HPP8dFHHzFr1ixWrFhBZmYm8+bNO6v3Onr0aNasWcP8+fNZuXIlhmFw6aWXUlhYCMC4cePIz8/np59+YuPGjTz33HPOvVuPPfYYmzdvZuHChSQmJjJz5kzCwsLOqp7T0WGpKrLhUBHdgdzsTPxP11hEpAHLK7TRcdL3bnntzU8k4OtVNT99TzzxBBdffLHzeWhoKN26dXM+f/LJJ/nyyy+ZP38+d9555ynXM3r0aK6//noAnnnmGf773/+yatUqBg0aVGb7wsJC3njjDVq1agXAnXfeyRNPPOGc/+qrrzJx4kSuvPJKAGbMmOHci1IZ27dvZ/78+axYsYJ+/foB8NFHHxETE8O8efO45pprSEpKYtiwYXTp0gWAli1bOpdPSkqiR48enHPOOYBj71V1056bKuLtFwiAD8ehCv9nICIitVPJj3WJ7OxsJkyYQIcOHQgODsbf35/ExMTT7rnp2rWrc9zPz4/AwEAOHTp0yva+vr7OYAMQFRXlbJ+RkUFqaiq9e/d2zrdYLPTq1euM3tuJEhMT8fDwoE+fPs5pjRo1ol27diQmJgJw991389RTT9G/f38mT57Mn3/+6Ww7duxY5syZQ/fu3XnggQf49ddfK11LRWnPTRXx9nP0ZLdgh6Lj4Onj5opERGonH08Lm59IcNtrVxU/Pz+X5xMmTGDx4sVMmzaN1q1b4+Pjw9VXX33aO6GffJsBk8lU7g1Gy2pflYfbKuPWW28lISGBb7/9lkWLFjF16lSmT5/OXXfdxeDBg9m7dy8LFixg8eLFXHTRRYwbN45p06ZVWz3ac1NFfP1O6DmvfjciIqdkMpnw9fJwy1CdV0lesWIFo0eP5sorr6RLly5ERkayZ8+eanu9sgQFBREREcHq1aud02w2G+vWrav0Ojt06EBRURG///67c1paWhpbt26lY8eOzmkxMTHcfvvtzJ07l/vuu4+3337bOS88PJxRo0bxv//9j5dffpm33nqr0vVUhPbcVBF/Hyu5hhVfUz55OZn4+p/Z6X4iIlK3tWnThrlz5zJkyBBMJhOPPfZYuXtgqstdd93F1KlTad26Ne3bt+fVV1/l2LFjFQp2GzduJCAgwPncZDLRrVs3rrjiCm677TbefPNNAgICeOihh2jSpAlXXHEFAPfccw+DBw+mbdu2HDt2jGXLltGhQwcAJk2aRK9evejUqRP5+fl88803znnVReGmivh4WkjDii/55GVn4Bvh7opERKQmvfjii9x8883069ePsLAwHnzwQTIzM2u8jgcffJCUlBRGjhyJxWJhzJgxJCQkYLGc/pDc+eef7/LcYrFQVFTErFmz+M9//sPll19OQUEB559/PgsWLHAeIrPZbIwbN479+/cTGBjIoEGDeOmllwDHtXomTpzInj178PHxYcCAAcyZM6fq3/gJTIa7D9TVsMzMTIKCgsjIyCAwMLBK171vSltiSOXAsK9o0uWCKl23iEhddfz4cXbv3k2LFi3w9vZ2dzkNjt1up0OHDlx77bU8+eST7i6nXOV9V87k91t7bqrQcZM3GFCQk+XuUkR
EpIHau3cvixYtYuDAgeTn5zNjxgx2797NDTfc4O7Saow6FFehfLPjDKn8vJrfDSkiIgJgNpuZPXs25557Lv3792fjxo0sWbKk2vu51Cbac1OFCs2+YIOiPJ0tJSIi7hETE8OKFSvcXYZbac9NFSr0cOy5KTqucCMiIuIuCjdVyGbxdTzmq8+NiIiIuyjcVCGbp+NCfrqIn4iIiPso3FQhw7P4Utz5Oe4tREREpAFza7j56aefGDJkCNHR0ZhMptPekn3u3LlcfPHFhIeHExgYSFxcHN9/7547y5bF8HKEG1Oh9tyIiIi4i1vDTU5ODt26deO1116rUPuffvqJiy++mAULFrB27VouvPBChgwZwvr166u50ooxWR2XrLYUKNyIiIi4i1tPBR88eDCDBw+ucPuXX37Z5fkzzzzDV199xddff02PHj2quLpK8AkGwKtI17kREZGKmzJlCvPmzWPDhg3uLqVeqNN9bux2O1lZWYSGhrq7FADMxeHGWqQ9NyIidZnJZCp3mDJlylmt++RuGBMmTGDp0qVnV3QFTJkyhe7du1f767hbnb6I37Rp08jOzubaa689ZZv8/Hzy8/Odz6vzJmYeviEAeNt0KriISF2WnJzsHP/000+ZNGkSW7dudU7z9/ev0tfz9/ev8nU2ZHV2z83HH3/M448/zmeffUbjxo1P2W7q1KkEBQU5h5iYmGqryTvAEW587dpzIyJSl0VGRjqHoKAgTCaTy7Q5c+bQoUMHvL29ad++Pa+//rpz2YKCAu68806ioqLw9vamefPmTJ06FYDY2FgArrzySkwmk/P5yXtURo8ezdChQ5k2bRpRUVE0atSIcePGUVhY6GyTnJzMZZddho+PDy1atODjjz8mNja2VBeOM7Fx40b+8Y9/4OPjQ6NGjRgzZgzZ2X//pi1fvpzevXvj5+dHcHAw/fv3Z+/evQD88ccfXHjhhQQEBBAYGEivXr1Ys2ZNpWs5G3Vyz82cOXO49dZb+fzzz4mPjy+37cSJExk/frzzeWZmZrUFHJ/AMAD8jRwwDDCZquV1RETqNMOAwlz3vLan71n/2/zRRx8xadIkZsyYQY8ePVi/fj233XYbfn5+jBo1iv/+97/Mnz+fzz77jGbNmrFv3z727dsHwOrVq2ncuDGzZs1i0KBBWCyWU77OsmXLiIqKYtmyZezYsYPhw4fTvXt3brvtNgBGjhzJkSNHWL58OZ6enowfP55Dhw5V+n3l5OSQkJBAXFwcq1ev5tChQ9x6663ceeedzJ49m6KiIoYOHcptt93GJ598QkFBAatWrcJUvD1HjBhBjx49mDlzJhaLhQ0bNuDp6Vnpes5GnQs3n3zyCTfffDNz5szhsssuO217q9WK1WqtgcogILgRAJ4UYRTmYio+NVxERE5QmAvPRLvntR8+CGf5b/PkyZOZPn06V111FQAtWrRg8+bNvPnmm4waNYqkpCTatGnDeeedh8lkonnz5s5lw8PDAQgODiYyMrLc1wkJCWHGjBlYLBbat2/PZZddxtKlS7ntttvYsmULS5YsYfXq1ZxzzjkAvPPOO7Rp06bS7+vjjz/m+PHjfPDBB/j5ObbRjBkzGDJkCM899xyenp5kZGRw+eWX06pVKwCXm3EmJSVx//330759e4CzquVsufWwVHZ2Nhs2bHD2Dt+9ezcbNmwgKSkJcOx1GTlypLP9xx9/zMiRI5k+fTp9+vQhJSWFlJQUMjIy3FF+KYGBQRQZjk2al5nm5mpERKSq5eTksHPnTm655RZnPxl/f3+eeuopdu7cCTgOKW3YsIF27dpx9913s2jRokq9VqdOnVz27ERFRTn3zGzduhUPDw969uzpnN+6dWtCQkIq/d4SExPp1q2bM9gA9O/fH7vdztatWwkNDWX06NEkJCQwZMgQXnnlFZe+SePHj+fWW28lPj6eZ5991rk93MGte27WrFnDhRde6Hxecvho1KhRzJ49m+TkZGfQAXjrrbcoKipi3LhxjBs3zjm9pL27+Vo9OIYvoWSTnZGGb1gzd5ckIlL7ePo69qC467XPQkn/k7fffps+ffq4zCsJIj179mT37t0sXLiQJUuWcO211xIfH88XX3xxZqWedEjHZDJht9vPovqzN2vWLO6++26+++47Pv30Ux599FEWL15M3759mTJlCjfccAPffvstCxcuZPLkycyZM4crr7yyxut0a7i54IILMAzjlPNPDizLly+v3oLOkslkItvkTyjZ5GYedXc5IiK1k8l01oeG3CUiIoLo6Gh27drFiBEjTtkuMDCQ4cOHM3z4cK6++moGDRrE0aNHCQ0NxdPTE5vNdlZ1tGvXjqKiItavX0+vXr0A2LFjB8eOHav0Ojt06MDs2bPJyclx7r1ZsWIFZrOZdu3aOdv16NGDHj16MHHiROLi4vj444/p27cvAG3btqVt27bce++9XH/99cyaNavhhZv6KNfsD3bIU7gREamXHn/8ce6++26CgoIYNGgQ+fn5rFmzhmPHjjF+/HhefPFFoqKi6NGjB2azmc8//5zIyEiCg4MBxxlTS5cupX///lit1kodSmrfvj3x8fGMGTOGmTNn4unpyX333YePj4+zg++p5OXllbpYYEBAACNGjGDy5MmMGjWKKVOmcPjwYe666y5uvPFGIiIi2L17N2+99Rb//Oc/iY6OZuvWrWzfvp2RI0eSl5fH/fffz9VXX02LFi3Yv38/q1evZtiwYWf83qqCwk0Vy7c4wk1htvrciIjUR7feeiu+vr688MIL3H///fj5+dGlSxfuuecewBEUnn/+ebZv347FYuHcc89lwYIFmM2OPpnTp09n/PjxvP322zRp0oQ9e/ZUqo4PPviAW265hfPPP5/IyEimTp3KX3/9hbe3d7nLbdu2rdRV/S+66CKWLFnC999/z3/+8x/OPfdcfH19GTZsGC+++CIAvr6+bNmyhffff5+0tDSioqIYN24c//73vykqKiItLY2RI0eSmppKWFgYV111FY8//nil3tvZMhnlHReqhzIzMwkKCiIjI4PAwMAqX//vLwylT84y1ne4nx7DH63y9YuI1DXHjx9n9+7dtGjR4rQ/vFJ5+/fvJyYmhiVLlnDRRRe5u5xKKe+7cia/39pzU8XyvUIhB8g94u5SRESkHvvhhx/Izs6mS5cuJCcn88ADDxAbG8v555/v7tLcTuGmihV6O651Y8nTYSkREak+hYWFPPzww+zatYuAgAD69evHRx995LYL59UmCjdVzO7ruEqx53GFGxERqT4JCQkkJCS4u4xaqc7eW6q2Mvs7rj7pXaCzpURERNxB4aaKeQY6buLpW5ju3kJERGqZBnb+ilRCVX1HFG6qmH+o414hAbZ09xYiIlJLlFy5t6CgwM2VSG1X8h0p74aiFaE+N1UsMMxxMzhf8qAgF7zO7lLfIiJ1nYeHB76+vhw+fBhPT0/n9V5ETmS32zl8+DC+vr54eJxdPFG4qWKhIY04bnjibSqkMDMVz7AW7i5JRMStTCYTUVFR7N69m71797q7HKnFzGYzzZo1O+1Vlk9H4a
aKBft6sY9QmpNKZupeGinciIjg5eVFmzZtdGhKyuXl5VUle/YUbqqY2WzisDmM5kYquUf20sjdBYmI1BJms1lXKJYaoQOf1SDT03HGVMHRJDdXIiIi0vAo3FSDbO8IAOzpB9xciYiISMOjcFMNCnyjAPDIPujmSkRERBoehZtqYAQ2AcA7N9nNlYiIiDQ8CjfVwLNRLADB+QdAV+QUERGpUQo31cA/qi02w4SPPQeyD7m7HBERkQZF4aYaRDYKZp/hOGOKI9vcW4yIiEgDo3BTDZoE+7DTcNyGoTB1q5urERERaVgUbqpBsK8nSWZHp+Lcg5vdXI2IiEjDonBTDUwmE4d9WjmepGx0bzEiIiINjMJNNUkP7gSAb9omsNvdXI2IiEjDoXBTTUzhbckzvPC05ULaDneXIyIi0mAo3FSTFo2D+MuIdTw5uN6ttYiIiDQkCjfVpHVjfzbaWzieJG9way0iIiINicJNNWkV/ne4MQ6sc3M1IiIiDYfCTTVpEuzDdktrAIzkP9WpWEREpIYo3FQTs9kEYW04bnhiLsqFY7vdXZKIiEiDoHBTjVpGBLHNaOp4krrJvcWIiIg0EAo31ahVuD+J9uaOJ6l/ubcYERGRBkLhphq1jfBnixHjeKJwIyIiUiMUbqpR+8hAthjNADB0WEpERKRGKNxUo2ahvuyxOE4HNx3bA/nZ7i1IRESkAVC4qUZms4mIyGiOGIGOCboNg4iISLVTuKlmHaIC2GVEOZ4o3IiIiFQ7hZtq1i4igF324nBzZLt7ixEREWkAFG6qWfuowBP23CjciIiIVDeFm2rWPjKAXUY0ALbD29xcjYiISP2ncFPNgn29yPKLdTxJ2wmG4dZ6RERE6juFmxrgH9WaQsOCpSgXMg+6uxwREZF6TeGmBrSJCiXJaOx4on43IiIi1UrhpgbodHAREZGa49Zw89NPPzFkyBCio6MxmUzMmzfvtMssX76cnj17YrVaad26NbNnz672Os9W+8hA9hiRABhpO91cjYiISP3m1nCTk5NDt27deO211yrUfvfu3Vx22WVceOGFbNiwgXvuuYdbb72V77//vporPTstw/3YZ3KEm+MpOiwlIiJSnTzc+eKDBw9m8ODBFW7/xhtv0KJFC6ZPnw5Ahw4d+OWXX3jppZdISEiorjLPmqfFTEFgS8gBuw5LiYiIVKs61edm5cqVxMfHu0xLSEhg5cqVp1wmPz+fzMxMl8EdfKLaAeCdnQS2IrfUICIi0hDUqXCTkpJCRESEy7SIiAgyMzPJy8src5mpU6cSFBTkHGJiYmqi1FIim7bguOGJxbBBRpJbahAREWkI6lS4qYyJEyeSkZHhHPbt2+eWOtpHBzs7FZO2yy01iIiINAR1KtxERkaSmprqMi01NZXAwEB8fHzKXMZqtRIYGOgyuEP7yABnuCk8rE7FIiIi1aVOhZu4uDiWLl3qMm3x4sXExcW5qaKKaxxgJdniuMdU5oEtbq5GRESk/nJruMnOzmbDhg1s2LABcJzqvWHDBpKSHH1SJk6cyMiRI53tb7/9dnbt2sUDDzzAli1beP311/nss8+499573VH+GTGZTBQGtQCg8JD23IiIiFQXt4abNWvW0KNHD3r06AHA+PHj6dGjB5MmTQIgOTnZGXQAWrRowbfffsvixYvp1q0b06dP55133qnVp4GfyDOiDQBemXvcW4iIiEg9ZjKMhnWb6szMTIKCgsjIyKjx/jfzf17HP5deiB0z5kdTwcOrRl9fRESkrjqT3+861eemrmvevAXZhjdm7JC+193liIiI1EsKNzWobWQgew3HdXoy9qtTsYiISHVQuKlBPl4WDns1BSAt6S83VyMiIlI/KdzUsLyAWACOp+qMKRERkeqgcFPDLOGOM6Y80ne7uRIREZH6SeGmhgU1cdxAMzhP95cSERGpDgo3NaxJq84AhNmOUJSf6+ZqRERE6h+FmxoWHdWUTMMXs8ngwK7N7i5HRESk3lG4qWFmi5lDnk0ASN2jM6ZERESqmsKNG2T7xwKQl7zNvYWIiIjUQwo3bmBq1MrxeHSnmysRERGpfxRu3MA/2nHGlH+OzpgSERGpago3bhAR2wmAaPtBMvIK3VyNiIhI/aJw4wb+0W0BiDQdY/u+FDdXIyIiUr8o3LiDTwhZ5iAAUnbrjCkREZGqpHDjJpm+zRyPB7e6uRIREZH6ReHGTewhLQEwjuiMKRERkaqkcOMmPlGOfje+2Xuw2w03VyMiIlJ/KNy4SXDT9gA0NZI5kJ7n5mpERETqD4UbN/EIaw1AC1MKicmZbq5GRESk/lC4cZfiqxSHmTLZtT/ZzcWIiIjUHwo37mININcrDID0/YluLkZERKT+ULhxo8KgFgAUHdrh5kpERETqD4UbN/KKcPS78cvZS16Bzc3ViIiI1A8KN27kHeE4HTzWlMz2Q1lurkZERKR+ULhxI1Ojv8+Y2pKscCMiIlIVFG7cqfiMqVhTCokpOh1cRESkKijcuFOIo0NxsCmH/QcOuLkYERGR+kHhxp28fCnwiwIgP3UbhqHbMIiIiJwthRs3sxRfqTgsfx+Hs/LdXI2IiEjdp3DjZpaw4n435hQSU9SpWERE5Gwp3LibyxlT6lQsIiJythRu3C307zOmtmrPjYiIyFlTuHE35+ngqbo7uIiISBVQuHG3kFgMk5kAUx7HDu+n0GZ3d0UiIiJ1msKNu3lYIagpADH2ZHYdznFzQSIiInWbwk0tUHIbhlhzClt0pWIREZGzonBTGxR3Km5hSmGLOhWLiIicFYWb2uCEe0zpdHAREZGzo3BTG2jPjYiISJVRuKkNTthzk5KRS3pugZsLEhERqbsUbmqD4GZgsuBjKiCCY9p7IyIichYUbmoDiyeExALQwqwrFYuIiJwNhZva4sROxTodXEREpNLcHm5ee+01YmNj8fb2pk+fPqxatarc9i+//DLt2rXDx8eHmJgY7r33Xo4fP15D1VajE+4xlZisPTciIiKV5dZw8+mnnzJ+/HgmT57MunXr6NatGwkJCRw6dKjM9h9//DEPPfQQkydPJjExkXfffZdPP/2Uhx9+uIYrrwbFe25amlLYlpqF3W64uSAREZG6ya3h5sUXX+S2227jpptuomPHjrzxxhv4+vry3nvvldn+119/pX///txwww3ExsZyySWXcP311592b0+dENoScPS5yS2wse9YrpsLEhERqZvcFm4KCgpYu3Yt8fHxfxdjNhMfH8/KlSvLXKZfv36sXbvWGWZ27drFggULuPTSS0/5Ovn5+WRmZroMtVLxLRiam1IxY9ehKRERkUpyW7g5cuQINpuNiIgIl+kRERGkpKSUucwNN9zAE088wXnnnYenpyetWrXiggsuKPew1NSpUwkKCnIOMTExVfo+qkxQU7BY8aSIaNMRdSoWERGpJLd3KD4Ty5cv55lnnuH1119n3bp1zJ07l2+//ZYnn3zylMtMnDiRjIwM57Bv374arPgMmC3OfjetTMls0Z4bERGRSvFw1wuHhYVhsVhITU11mZ6amkpkZGSZyzz22GPce
OON3HrrrQB06dKFnJwcxowZwyOPPILZXDqrWa1WrFZr1b+B6tCoNRzaTCvTQX7QnhsREZFKcdueGy8vL3r16sXSpUud0+x2O0uXLiUuLq7MZXJzc0sFGIvFAoBh1IOzi8LaAtDKdJC9R3PJLShyc0EiIiJ1j9v23ACMHz+eUaNGcc4559C7d29efvllcnJyuOmmmwAYOXIkTZo0YerUqQAMGTKEF198kR49etCnTx927NjBY489xpAhQ5whp04rDjftPVIwimBbajbdY4LdW5OIiEgd49ZwM3z4cA4fPsykSZNISUmhe/fufPfdd85OxklJSS57ah599FFMJhOPPvooBw4cIDw8nCFDhvD000+76y1UrbA2ALQyHwRgS3Kmwo2IiMgZMhn14nhOxWVmZhIUFERGRgaBgYHuLsdVfhZMbQpA1+Nvc1W/Tkz5Zyc3FyUiIuJ+Z/L7XafOlqr3rAEQEAVAS1OyTgcXERGpBIWb2qbk0JTpIFtSsupHR2kREZEapHBT2xR3Km5tOUh6biGpmfluLkhERKRuUbipbYrDTRer4+ahiTo0JSIickYUbmqb4sNSrZ1nTOlKxSIiImdC4aa2aeQIN+GFB/CgiK3acyMiInJGFG5qm8Am4OmLxbARYzrMlhTtuRERETkTCje1jdnsuMcUjjOmdhzKpqDI7uaiRERE6g6Fm9qouFNxR68UiuwGOw9nu7kgERGRukPhpjYqDjfdfRxnTOlifiIiIhWncFMbhTkOS7U2JQOo342IiMgZULipjYr33EQUJgEGiTodXEREpMLceldwOYVGrQET1sJMGpHJXwesGIaByWRyd2UiIiK1nvbc1EaePhDcDIAOlgOk5RRwMOO4m4sSERGpGxRuaqvGHQHoH3QYgI37091YjIiISN2hcFNbNW4PQA/vFAD+3J/hzmpERETqDIWb2iq8AwAtjX2Awo2IiEhFKdzUVsV7bkJzdwEGf+5PxzAM99YkIiJSByjc1FZhbcFkxiM/nSaWTDKPF5F0NNfdVYmIiNR6Cje1lacPhLQA4KJGaYAOTYmIiFREpcLNvn372L9/v/P5qlWruOeee3jrrbeqrDABGjv63fQJcNyGYeMBhRsREZHTqVS4ueGGG1i2bBkAKSkpXHzxxaxatYpHHnmEJ554okoLbNDCHf1u2psPAvCnTgcXERE5rUqFm02bNtG7d28APvvsMzp37syvv/7KRx99xOzZs6uyvoateM9NVMEeADYdyMRuV6diERGR8lQq3BQWFmK1WgFYsmQJ//znPwFo3749ycnJVVddQ1ccbnzSt+HtaSI7v4jdaTluLkpERKR2q1S46dSpE2+88QY///wzixcvZtCgQQAcPHiQRo0aVWmBDVqj1mCyYMrPZEBEIaBDUyIiIqdTqXDz3HPP8eabb3LBBRdw/fXX061bNwDmz5/vPFwlVcDDCo1aATAw+AgAG5LS3ViQiIhI7Vepu4JfcMEFHDlyhMzMTEJCQpzTx4wZg6+vb5UVJzg6FR/ZRg/vVCCSdQo3IiIi5arUnpu8vDzy8/OdwWbv3r28/PLLbN26lcaNG1dpgQ1ecb+bWCMJgM3JmeQWFLmzIhERkVqtUuHmiiuu4IMPPgAgPT2dPn36MH36dIYOHcrMmTOrtMAGrzjc+KZvIzLQG5vd0MX8REREylGpcLNu3ToGDBgAwBdffEFERAR79+7lgw8+4L///W+VFtjgRXQGwHQokV7NAgBYl3TMnRWJiIjUapUKN7m5uQQEOH5oFy1axFVXXYXZbKZv377s3bu3Sgts8EJbgocPFOZyQVg2AOv2KtyIiIicSqXCTevWrZk3bx779u3j+++/55JLLgHg0KFDBAYGVmmBDZ7ZAhGdADjH23HLi3VJukO4iIjIqVQq3EyaNIkJEyYQGxtL7969iYuLAxx7cXr06FGlBQoQ2QWAmIJdeFnMHM0pYE+a7hAuIiJSlkqFm6uvvpqkpCTWrFnD999/75x+0UUX8dJLL1VZcVIs0tHvxuPQJro0DQJ0aEpERORUKhVuACIjI+nRowcHDx503iG8d+/etG/fvsqKk2KRXR2PKRvp2SwYgLXqVCwiIlKmSoUbu93OE088QVBQEM2bN6d58+YEBwfz5JNPYrfbq7pGadwRMEF2CnERjr42a/YcdW9NIiIitVSlrlD8yCOP8O677/Lss8/Sv39/AH755RemTJnC8ePHefrpp6u0yAbP6u+4DUPaDs7xOQDAttRs0rLzaeRvdXNxIiIitUulws3777/PO++847wbOEDXrl1p0qQJd9xxh8JNdYjoDGk7CExPpF1ED7amZrFq91EGd4lyd2UiIiK1SqUOSx09erTMvjXt27fn6FEdLqkWxWdMkbKJvi1DAfhtV5obCxIREamdKhVuunXrxowZM0pNnzFjBl27dj3roqQMzk7Ff9K3ZSMAft+tICkiInKySh2Wev7557nssstYsmSJ8xo3K1euZN++fSxYsKBKC5RiUcXh5sg2ejdx9LPZkpLF0ZwCQv283FiYiIhI7VKpPTcDBw5k27ZtXHnllaSnp5Oens5VV13FX3/9xYcffljVNQpAQCQERIFhp1HWVtpG+AOwarcOTYmIiJyoUntuAKKjo0t1HP7jjz949913eeutt866MClDdE/Y+i0cXEfflgPZlprNb7uOMqizOhWLiIiUqPRF/MQNmhTf2uLAOme/G3UqFhERceX2cPPaa68RGxuLt7c3ffr0YdWqVeW2T09PZ9y4cURFRWG1Wmnbtm3D6ecT3dPxeHAdfVo4zpjakpLFkex8NxYlIiJSu7g13Hz66aeMHz+eyZMns27dOrp160ZCQgKHDh0qs31BQQEXX3wxe/bs4YsvvmDr1q28/fbbNGnSpIYrd5Po4j03R3fRyJJLp2jHHdh/2X7EjUWJiIjULmfU5+aqq64qd356evoZvfiLL77Ibbfdxk033QTAG2+8wbfffst7773HQw89VKr9e++9x9GjR/n111/x9PQEIDY29oxes07zDYWQWDi2Bw6u5/y20fx1MJOfth1maI8GEvBERERO44z23AQFBZU7NG/enJEjR1ZoXQUFBaxdu5b4+Pi/izGbiY+PZ+XKlWUuM3/+fOLi4hg3bhwRERF07tyZZ555BpvNdsrXyc/PJzMz02Wo05yHptYzoE0YAD9tP4LdbrixKBERkdrjjPbczJo1q8pe+MiRI9hsNiIiIlymR0REsGXLljKX2bVrFz/88AMjRoxgwYIF7NixgzvuuIPCwkImT55c5jJTp07l8ccfr7K63a5JT/hrLhxYxzlx9+LrZeFIdj6JKZl0ig5yd3UiIiJu5/YOxWfCbrfTuHFj3nrrLXr16sXw4cN55JFHeOONN065zMSJE8nIyHAO+/btq8GKq8EJe268PMzEFZ819dM29bsREREBN4absLAwLBYLqampLtNTU1OJjIwsc5moqCjatm2LxWJxTuvQoQMpKSkUFBSUuYzVaiUwMNBlqNOiuoHJDJkHICuV89uGA/DTtsNuLkxERKR2cFu48fLyolev
XixdutQ5zW63s3TpUuctHU7Wv39/duzYgd1ud07btm0bUVFReHk1kFsQWP0hrJ1j/MBaZ7hZs/coOflFbixMRESkdnDrYanx48fz9ttv8/7775OYmMjYsWPJyclxnj01cuRIJk6c6Gw/duxYjh49yn/+8x+2bdvGt99+yzPPPMO4cePc9Rbco+k5jsd9vxPbyJeYUB8KbQYrdujQlIiISKVvv1AVhg8fzuHDh5k0aRIpKSl0796d7777ztnJOCkpCbP57/wVExPD999/z7333kvXrl1p0qQJ//nPf3jwwQfd9Rbco1lfWP8h7Psdk8nERe0jmP3rHpYkpnJJp7IP6YmIiDQUJsMwGtQ5xJmZmQQFBZGRkVF3+9+k7YRXe4LFChP3sWJPFiPe+Z1Gfl6seiQei9nk7gpFRESq1Jn8fteps6WkWGhL8A0DWz4c3EDvFqEEeHuQllPAhn3H3F2diIiIWync1EUmk+PQFEDSSjwtZi5s1xiAxZvLvnWFiIhIQ6FwU1eVhJt9vwNwcUdHP6XFm1PcVZGIiEitoHBTV8WU7Ln5DQyDge3C8bSY2Hk4h12Hs91bm4iIiBsp3NRVUd3AwxvyjsKR7QR6e9K3+GrFCzdp742IiDRcCjd1lYcXNOnlGE9y3Gj0si5RAHzzZ7K7qhIREXE7hZu6rHl/x+PunwBI6BSJh9lEYnImO3VoSkREGiiFm7qs5UDH4+4fwTAI8fOif+swAL75Q3tvRESkYVK4qcuangsePpBzGA5tBuDyro5DU99uPOjOykRERNxG4aYu87BC8+KbjO76EYBLOkXiZTGzLTWbbalZbixORETEPRRu6rqWFzgedzvCTZCPJ+e3LTk0pb03IiLS8Cjc1HUtivvd7FkBtiIALu8aDcC8DQdpYLcOExERUbip8yK7gk8IFGTBwXUAXNIpAj8vC0lHc1m9R/eaEhGRhkXhpq4zmyF2gGN813IAfL08uLT4mjf/t3a/mwoTERFxD4Wb+qD1RY7H7Yudk67u1RSAbzcmk1tQ5I6qRERE3ELhpj5ofbHjcf9qyEkD4NzYUJqF+pKdX8T3f+l2DCIi0nAo3NQHQU0gojNgwM6lAJjNJq7q2QSA/1t7wI3FiYiI1CyFm/qizSWOx23fOycN6+k4NLVi5xEOpOe5oyoREZEap3BTX5SEmx1LwG4DICbUl74tQzEM+Gz1PjcWJyIiUnMUbuqLpueCdzAcT3f0vSl2Q5/mAMxZnUShze6e2kRERGqQwk19YfE44aypRc7JgzpF0sjPi9TMfJYmHnJTcSIiIjVH4aY+cfa7+TvceHmYufbcGAA++n2vO6oSERGpUQo39UnreMAEqRsh/e8+Njf0bobJBD9vP8KeIznuq09ERKQGKNzUJ35h0Kz4LuGJ852TY0J9Gdg2HIBPViW5ozIREZEao3BT33Qa6nj8a57L5BHFHYs/W7OP44W2mq1JRESkBinc1Dcd/ul43L8KMv6+r9Q/2jemSbAPx3IL+WqDLuonIiL1l8JNfRMYBTF9HeOb/z40ZTGbGNXPsffmnZ93YxiGO6oTERGpdgo39VHJoanNX7lMvq53M/ytHmw/lM3ybYdrvi4REZEaoHBTH5Ucmtr3G2T8fQgq0NuT4cWnhb/z8y53VCYiIlLtFG7qo6Amf581tfFzl1k39Y/FYjaxYkcafx3McENxIiIi1Uvhpr7qdp3j8Y85cEL/mqYhvgzuHAnAuz/vdkdlIiIi1Urhpr7qOBQsVjicCCl/usy6dUBLAOb/cZDkDN0tXERE6heFm/rKJxjaDXaM/zHHZVb3mGB6twilyG7w5o/qeyMiIvWLwk191u16x+PGz8FW6DLrrn+0BhxXLD6UdbymKxMREak2Cjf1WeuLwDcMcg7Dzh9cZp3XOozuMcHkF9nV90ZEROoVhZv6zOIJXa5xjK//0GWWyWRy7r358Le9HMspqOnqREREqoXCTX3X80bH45YFkJnsMusf7RvTMSqQ3AIb763Q3hsREakfFG7qu4hOjtsxGDZY/z+XWSfuvZm9Yg8ZeYVlrUFERKROUbhpCM652fG4djbYXe8IntApkrYR/mTlF/HuL9p7IyIidZ/CTUPQ8QrwCYXM/bB9kcsss9nEvfFtAXj3512kZee7o0IREZEqo3DTEHh6Q48RjvHV75aaPahzJF2aBJFTYGPm8p01XJyIiEjVUrhpKHrdBJhgx2I4st1llslkYkJCOwA++G2vrlosIiJ1msJNQ9GoFbQd5Bj/bWap2ee3CaN3i1AKiuz8d+mOGi5ORESk6ijcNCRxdzgeN3wMuUddZplMJu4v3nvz2Zp97DmSU9PViYiIVIlaEW5ee+01YmNj8fb2pk+fPqxatapCy82ZMweTycTQoUOrt8D6InYARHaBojxYO6vU7HNjQ7mwXTg2u8GLi7e5oUAREZGz5/Zw8+mnnzJ+/HgmT57MunXr6NatGwkJCRw6dKjc5fbs2cOECRMYMGBADVVaD5hM0HecY3zV21BU+qrE913i2Hsz/4+D/Lk/vQaLExERqRpuDzcvvvgit912GzfddBMdO3bkjTfewNfXl/fee++Uy9hsNkaMGMHjjz9Oy5Yta7DaeqDzMPCPgKxk2Dyv9OwmQVzZowkAT32biGEYNVygiIjI2XFruCkoKGDt2rXEx8c7p5nNZuLj41m5cuUpl3viiSdo3Lgxt9xyy2lfIz8/n8zMTJehQfPwgnNvc4yvfA3KCC/3J7TD6mFm1e6jLNqcWsMFioiInB23hpsjR45gs9mIiIhwmR4REUFKSkqZy/zyyy+8++67vP322xV6jalTpxIUFOQcYmJizrruOu+cm8HDG5I3wO6fSs2ODvbhtgGOPWJTFyRSUGSv4QJFREQqz+2Hpc5EVlYWN954I2+//TZhYWEVWmbixIlkZGQ4h3379lVzlXWAXyPoOdIx/tMLZTa5/YJWhPlb2ZOWy/9+21uDxYmIiJwdt4absLAwLBYLqamuhz5SU1OJjIws1X7nzp3s2bOHIUOG4OHhgYeHBx988AHz58/Hw8ODnTtLX13XarUSGBjoMgjQ/z9g9oQ9P8OeFaVm+1s9GH+x47YMryzdTnpu6c7HIiIitZFbw42Xlxe9evVi6dKlzml2u52lS5cSFxdXqn379u3ZuHEjGzZscA7//Oc/ufDCC9mwYYMOOZ2JoKbQ41+O8Z+eL7PJtec0pV1EABl5hbz6gy7sJyIidYPbD0uNHz+et99+m/fff5/ExETGjh1LTk4ON910EwAjR45k4sSJAHh7e9O5c2eXITg4mICAADp37oyXl5c730rdc969YPaAXcthX+lrC3lYzDx8WQcA3v91D9tTs2q4QBERkTPn9nAzfPhwpk2bxqRJk+jevTsbNmzgu+++c3YyTkpKIjk52c1V1lMhzaHb9Y7xH8veezOwbTjxHSIoshtM+fovnRouIiK1nsloYL9WmZmZBAUFkZGRof43AEd3wavngGGDW5ZAzLmlmiSl5RL/0o8UFNl57YaeXNY1yg2FiohIQ3Ymv99u33MjbhbaEro
X771ZMqXM6940a+TL2IGtAHjq283kFhTVYIEiIiJnRuFGYOBDYLHC3l9gx9Iym4y9oBVNQ3xIzjjODHUuFhGRWkzhRiA4BnoXX7V46RSwl75on7enhUmXdwTg7Z93setwdg0WKCIiUnEKN+Jw3niwBkLKRvhrbplNLu4YwQXtwim0GUz5erM6F4uISK2kcCMOfo2g392O8R+eAlthqSYmk4nJQzrhZTHz07bDLNhY9i0yRERE3EnhRv7Wdyz4hcOx3bB2dplNWoT5cfsFjs7FU77+i4y80iFIRETEnRRu5G9Wfxj4oGN82TOQd6zMZndc0IqW4X4czsrn2YVbarBAERGR01O4EVe9boLwDpB3FJY/V2YTb08LU6/sAsAnq5JYtftoTVYoIiJSLoUbcWXxgEFTHeOr3oLDW8ts1qdlI67v7biX18S5f5JfZKupCkVERMqlcCOltboQ2l3quGrx9w+fstlDgzoQ5m9l5+EcXl9W+o7sIiIi7qBwI2W75Ckwe8KOJbBtUZlNgnw9mfJPx7VvXl++gx2HdGNNERFxP4UbKVujVtD3dsf49xOhKL/MZpd1ieKi9o0ptBk88MWf2Oy69o2IiLiXwo2c2vn3g38EpO2AFf8ts4nJZOLJoZ0JsHqwLimdd37eVcNFioiIuFK4kVPzDoKEZxzjP70AaWX3q4kO9uGx4lszTF+8je2pOjwlIiLuo3Aj5es8DFpeCLZ8WDChzLuGA1xzTlMubBdOQZGd+z7/gyJb6ftTiYiI1ASFGymfyQSXTXfcNXznD/DXl6doZuLZYV0J9Pbgz/0ZvPGjzp4SERH3ULiR02vUCgaMd4x/NxGOZ5TZLCLQm8ev6ATAK0u3s/lgZk1VKCIi4qRwIxXT/x4IbQXZKY4ba57C0O5NuLhjBIU2g/s+/0MX9xMRkRqncCMV4+kNl7/oGF/1NiT9VmYzk8nEM1d2IcTXk8TkTKZ9X/YVjkVERKqLwo1UXMsLoPu/AAO+GgeFeWU2Cw+w8tywrgC8/fNuftx2uOZqFBGRBk/hRs5MwtPgH+m49s3yqadsdkmnSG7s2xyA+z77gyPZZV8EUEREpKop3MiZ8QmGy19yjP/6KhxYe8qmj1zWgbYR/hzJzuf+z//AOMVp5CIiIlVJ4UbOXPtLocs1YNhh3rhT3prB29PCf6/vgZeHmWVbDzP71z01W6eIiDRICjdSOYOfB79wOJxY7tlT7SMDefSyDgBMXbCFTQfKPo1cRESkqijcSOX4hsKQ4vtN/foq7PnllE1v7NuciztGUGCzM/ajtWTkFtZQkSIi0hAp3Ejltb8Ueo4EDPjy9lNe3M9kMjHt6m40C/Vl39E87v1sA3bdPVxERKqJwo2cnYRnICQWMvbBggdO2SzI15OZ/+qJ1cPMD1sO8dqyHTVXo4iINCgKN3J2rAFw5VtgMsOfc0557ymATtFBPDm0MwAvLtnGz9t1/RsREal6Cjdy9pr1gfOK7z319T2Qvu+UTa89J4brzo3BMODuT9az/1huzdQoIiINhsKNVI0LHoLoHnA8Hf7vFrCdutPwlH92okuTII7lFnLr+2vIyS+quTpFRKTeU7iRqmHxhKtngTUQ9v0OPzx5yqbenhbevLEXYf5WtqRk8Z856mAsIiJVR+FGqk5oC7hihmN8xSuw7ftTNo0O9uGtkb3w8jCzJDGV53WDTRERqSIKN1K1Ol4Bvcc4xr+8HTL2n7Jpz2YhvHC14wabb/y4ky/WnrqtiIhIRSncSNW75CmI6gZ5R+GLm8vtf3NF9ybceWFrAB6eu5GVO9NqqkoREamnFG6k6nlY4ZrZf/e/WfRYuc3HX9yWwZ0jKbDZGfPBGjYfzKyZOkVEpF5SuJHqEdoShr7uGP99Jmz4+JRNzWYTLw3vTu/YULLyixg1axX7juoUcRERqRyFG6k+HYbAwAcd41/fAwfWnrKpt6eFt0edQ/vIAA5n5TPqvVWkZZd9t3EREZHyKNxI9Rr4ELQdDLZ8mPMvyD50yqZBPp68f3NvmgT7sOtIDjfPXk3Wcd1kU0REzozCjVQvsxmuegvC2kLWQfhsJBQVnLJ5RKA379/cmxBfT/7Yn8HoWavJ1kX+RETkDCjcSPXzDoTrPnZ0ME5aCfPvAuPUF+1r3difD2/pQ6C3B2v3HuPmWavJLVDAERGRilG4kZoR1sZxBpXJ4rjB5vJny23euUkQH97ShwCrB6v2HOWW2WvIK7DVTK0iIlKnKdxIzWl9EVz+omP8x2fLPYMKoFtMMO/f0ht/qwcrd6Vxy/urdR8qERE5LYUbqVm9RsN59zrG598Nu34st3nPZiHMvulc/Lws/LozjRHv/E567qn77IiIiCjcSM37xyTodBXYC2HOCDiwrtzm58SG8vFtfQn29WTDvnSGv/kbhzKP11CxIiJS19SKcPPaa68RGxuLt7c3ffr0YdWqVads+/bbbzNgwABCQkIICQkhPj6+3PZSC5nNMHQmxA6Agiz43zA4tKXcRbrFBPPZv+NoHGBla2oW17y5kqQ0XehPRERKc3u4+fTTTxk/fjyTJ09m3bp1dOvWjYSEBA4dKvt6KMuXL+f6669n2bJlrFy5kpiYGC655BIOHDhQw5XLWfH0hus/gSa9HPeg+uAKOLq73EXaRgTwxe39aBbqy960XK58fQVr9x6roYJFRKSuMBlGOefk1oA+ffpw7rnnMmPGDADsdjsxMTHcddddPPTQQ6dd3mazERISwowZMxg5cuRp22dmZhIUFERGRgaBgYFnXb+cpdyjMPsyOLQZgpvDzd9BYHS5ixzKPM7N769m04FMvDzMTL+mG0O6lb+MiIjUbWfy++3WPTcFBQWsXbuW+Ph45zSz2Ux8fDwrV66s0Dpyc3MpLCwkNDS0zPn5+flkZma6DFKL+IbCjV9CSAtI3wvvD4HMg+Uu0jjQm8/+HUd8hwgKiuzc9cl6ZvywHTfndBERqSXcGm6OHDmCzWYjIiLCZXpERAQpKSkVWseDDz5IdHS0S0A60dSpUwkKCnIOMTExZ123VLGASBj5FQQ1g7QdMOtSSN9X7iK+Xh68eWMvbjmvBQDTFm1j3MfrdDVjERFxf5+bs/Hss88yZ84cvvzyS7y9vctsM3HiRDIyMpzDvn3l/2iKm4Q0h5u+hZBYOLYbZl8Kx/aUu4jFbOKxyzvy1NDOeJhNLNiYwhUzfmHHoewaKVlERGont4absLAwLBYLqampLtNTU1OJjIwsd9lp06bx7LPPsmjRIrp27XrKdlarlcDAQJdBaqngZjB6AYS2gvQkmHUZHNl+2sX+1bc5n/67LxGBVnYezuGKGb+wYGNyDRQsIiK1kVvDjZeXF7169WLp0qXOaXa7naVLlxIXF3fK5Z5//nmefPJJvvvuO84555yaKFVqSlATGP2t40abmfvh3Yth3+lP9e/VPJRv7hpAnxah5BTYuOOjdTz85Ubdk0pEpAFy+2Gp8ePH8/bbb/P++++TmJjI2LFjycnJ4aabbgJg5MiRTJw40dn+ueee47HHHuO9994jNjaWlJQUUlJSyM7WoY
h6IzAKbloI0T0h7xi8/0/YsuC0i4UHWPno1j78e2BLAD7+PYnL//sLG/dnVHfFIiJSi7g93AwfPpxp06YxadIkunfvzoYNG/juu++cnYyTkpJITv77EMPMmTMpKCjg6quvJioqyjlMmzbNXW9BqoNfGIz+BtpcAkV58OkIWDPrtIt5WMxMHNyBj27tQ2SgN7uO5HDl6yt4bdkOimz2GihcRETcze3Xualpus5NHWMrgm/+A+v/53je53a45CmweJ520fTcAibO3cjCTY4z7zpFB/LcsK50bhJUnRWLiEg1qDPXuRE5LYsH/HMGXPCw4/nvb8D/rnJc/O80gn29eH1ET6Zd040gH0/+OpjJFa+tYOrCRPIKbNVcuIiIuIv23EjdkfgNfPlvKMh2nFl13ccQ2aVCix7Oyufxr//imz8dhzibhfry6GUduLhjBCaTqTqrFhGRKnAmv98KN1K3pG6GOdc7roHj4Q0Jz8A5N0MFA8qSzak89tUmkjMcdxUf0CaMxy7vSNuIgGosWkREzpbCTTkUbuqB3KOOPTjbFzmed/gn/PNV8Amu0OLZ+UW8vmwH7/y8mwKbHYvZxL/6NOPui9rQyN9afXWLiEilKdyUQ+GmnrDb4bfXYckUsBc6bt1w1ZvQvF+FV5GUlstT325m0WbHRST9vCzcfF4Lbh3QkiCf03dYFhGRmqNwUw6Fm3rmwFr44ubiWzWYoM+/4aJJ4OVX4VWs2HGEZxduYeMBx/Vwgnw8+ffAlozuF4uvl0f11C0iImdE4aYcCjf10PFMWPQIrPvA8Twk1nGGVYsBFV6FYRh8/1cK0xZtc96bKsTXk5FxsYzqF0uon1c1FC4iIhWlcFMOhZt6bMcSmP8fx20bALrdABc/Dv6NK7wKm93gqw0HeGXpdvam5QLg42lh+Lkx3DqgBU1DfKujchEROQ2Fm3Io3NRzxzNh8WOwdrbjuTUQLpgIvW+r0IX/StjsBgs3JfPGjzvZdCATcNyF/JKOEdzYtzlxrRrpFHIRkRqkcFMOhZsGYv8a+PY+SN7geB7eAS55ElrHV/i0cXAcrlqxI403ftzJLzuOOKe3DPfjxr7NuapnU3U+FhGpAQo35VC4aUDsNlj/ISx5HPKKr2jcrB/ET4Zmfc94dVtTsvjfb3uZu24/OcVXOPb2NDOoUyRX9WxK/9ZhWMzamyMiUh0UbsqhcNMA5R6Fn6fDqrfBlu+Y1iYB/vEIRHU749Vl5xfx5foDfLhyD9tS/74bfUSglaE9mjCsZ1NdFFBEpIop3JRD4aYBy9gPPz4H6z8Co/jeUq0ugvPuhdjzzuhwFTgOWW3Yl87cdQeY/8dBMvIKnfPaRvgzuHMUg7tE0i4iQP1zRETOksJNORRuhCM7YPlU+GsuGHbHtCa9IO5O6DDkjDoel8gvsrFsy2HmrtvPsq2HKLT9/WfVIsyPQZ0jSegUSdcmQZh16EpE5Iwp3JRD4Uacju6GlTNg/f+gyHGvKfwjodco6DUaAqMrtdqM3EKWJKaycFMKP20/TEGR3TmvkZ8X57cN54J24ZzfJpwQXT9HRKRCFG7KoXAjpWQfhlVvOU4fzznkmGayQLvB0H2E4wwrj8qFkOz8IpZtOcR3m1L4cdthsvOLnPNMJujWNJiBbcOJa9WI7jHBeHtaquANiYjUPwo35VC4kVMqKoAtX8Pqd2Hvir+n+4RC52HQ7TrH4atK9p8pKLKzdu8xlm87xI9bD7MlJctlvpeHmZ7NgunTohF9WzaiRzOFHRGREgo35VC4kQpJ3ew4XLXx87/35oDj1g7tL3fcibzpuWA2V/olkjPy+HHrYVbsTOO3XWkczsp3me/lYaZzdCA9moXQPSaYHs2CaRLso87JItIgKdyUQ+FGzoitCHYthz8/hS3fQGHu3/P8I6DdpdD+MmjeH7wqf2sGwzDYdSSH33al8fuuo6wsI+wAhPlb6dEsmO4xwXRtGkSHqEDC/K2Vfl0RkbpC4aYcCjdSafnZsHMpJH4N276H/My/51m8oFkctPqHY4jofFZ7dQzDYE9aLuuTjrFhXzrrk9JJTM6kyF76z7VxgJWO0YF0jAp0PsY28tNZWSJSryjclEPhRqpEUQHs+QkSv3HcsDNjn+t8v3DH3pxmcY6rIUd0BovHWb3k8UIbmw5kOMLOvnQSD2ayOy2Hsv6CfTwttG7s7xxahTsemzfyxdNS+dAlIuIuCjflULiRKmcYkLYDdv7gGHb/DIU5rm28AiDmXIjpC9E9ILr7Gd2t/FRy8ovYkpLF5uRMNh/MZHNyJluSM8k/4fTzE3laTDRv5EfrcH9aNfajeagfzRr50ryRLxEB3trbIyK1lsJNORRupNoVFcD+1ZC0EpJ+g32/ux7CKhEQ7Qg5Ud0djxGdHdfWOcsOw0U2O3vSctlxKJudh7PZcejvIa/QdsrlvDzMNAv1pXmoryPwhPrSvJEfTUJ8iAryJsBbNwgVEfdRuCmHwo3UOLsNDiU6ws7+1XBwAxzZBpTxp2cNhPB2EN4eGncoHu9QJaHHbjc4mJHnDDp70nLYm5ZL0tFcDhzLK7M/z4kCvD2IDvIhOtibqGAfmgQ7Qk9UkGM8IsiK1UOnrotI9VC4KYfCjdQK+VmQstERdJI3OB7Tdvx9z6uTeflDSAsIjS1+bAmhLRzjQU3BfHahoshm52D6cfYe/Tvw7E3LIeloHgfT81zum1WeUD8vwv2tNA60Eh7gGBoHeNPYOW6lcaA3fl4WndIuImdE4aYcCjdSaxUVOALO4UQ4vNWxt+fwFkjbeerQA2D2cBziCoyGoCYQWDycOO4XflZnb+XkF5GckceB9OMkp+dxMOM4B9PzSM7I42C6Y/xU/XzK4uNpoXGglUZ+XoQWDyF+XjTy8yLE14tG/sWPflZC/Dzxt3ooDIk0cAo35VC4kTqnqADS9zruhXV0Fxwrfjy62zHdVnD6dZgsjoATEOG4Po9/4+LHCPBtBD4h4BvquBqzb6hjT9EZhAnDMDiWW8ihrOMczsrnUGY+h7LyHeNZxzmUlc+RLMe0E29BUVFeFjMhfp7O4BPs40WgjweBPp4ElTEEehc/+nhiUSdpkXpB4aYcCjdSr9htkJUCmQcgYz9kHjxh/ABkHIDsVMrs31Mes+ffYccZfE54tAYWDwHgXfxoDfh7Wjl3Vs8tKCoOPfmkZRdwNKeAY7kFpGUXP+YUcCzHMf1oTkG5naArIsDqGoICfTwI9PbEz+pBgLcH/lYP/EserX8/D7B64me14O/tob5EIrWAwk05FG6kwbEVQs5hR8jJPlT8mApZxY95xyD3KOQddTzaSl8Z+Yx5+JwQePzB0w88fRyDV8l48aOXL3ieMHj5usw/bniSXmji2HETRwtMpB2HY8dNpOdDRl4hmccLychzDJl5f4/nFpxdKDqRl8XsDEB+Vg8CTghEflYLPp4e+HpZ8PGy4Otlwc/LwznuePTA74RxXy8LVg+zDrWJnIEz+f0+u6uKiUjtZ/F09McJjD59W8Nw3GLi5MCTdxRyjzke89Idp7bnZxUPJ4yX3J6iKM8xn
HhfrkryBiKLBxcmM1is4FE8WKyOu7cHe0MjL+wWL4pMnhSavCjEk3zDQoFhId9udhny7GaO28zk2czk2UzkFJnJK4KcIhO5NjOFWCgyLBTleVCYZ8GGmSIs2LBwFDOHMWM3zNgwYceMrXg41bjNcKwDkxkvL0+8PD2xFj/6WL3w9vLE6umJj9UTHy8LVg8L3p4WvD3NjkeP4sfiaVZPC94ejuDk7WnG+6T2ClHSECnciMjfTCbHnhUvP8dZWGfKVgQFWXD8hMBTkO0IPQW5jsfCXCjMg4Icx2Nh8WOZ83OhKN/Rr6joOBgndFo27H+HqDKYAa/ioVLMxUN1KyoeymA3TNhxDMYJg704SBlwimlm8oE8zNgxAabiPlQmDJMjWJlMJgyKg4/JjMlsdrQxnTDN5DrfbDIVTwOzCUwmc/GjqfRzs8nRHorD1QkBq7ywVWpeecudat5J7dw6j/LnnXjwxHB8epUfx/G8zPGKtDnbcf4ej+wCV75R5iaoCQo3IlJ1LB6OPjk+IdWzfluR47DZiYGnqKB4WvHzkvGSdkX5jnFboWOwFzrWYy8qHi90jFd4XvE0u80RsOw2x9lszkf7Sc9dpxvFy5rKOwOumNlkFMeVKnTC76FItfGs/I2Eq4LCjYjUHRYPx+Dl5+5KKs3l/+2nCUIYRvHequLHkufOaSdN5+/5RTY7BUVFFBQWkV9oKx53PBaWPC8qorDIXvzoGLfZbBTZbBTZ7BTZiiiy2bEV2RzT7XYK7XaKbEbxfDtFdsP53DHPTqHNsT+pzPcMJ80zTpp3soq3PZP1mkyO+RazCU+zCUvJYDLhaXGMe5jBYjZjMYNH8TwPi2Oah6m4jQUsJeMnrqdk+eK9WB5mx94ti8mExWLCYgJz8XrMFsdeMQ+LCbPJXLy8GbPFXNzejMXkqMNkMp+w98fkOu54YyeNc4rpJ7cxnfl4qXWdMN07GHdSuBERcRdz8bGvcs4uqyyP4sEd/382DINCm0F+kY2CIjsFNjv5hSc+2sgvtJNvs1NQZCe/qOTR5vK80OZYpmS8sMgoPc1mUGArGS+Zbpww/+/nBbYyrsVUdf3Oa4SH2RGCPM1mPD3MeJhNeFrMeFpMeFgcz72Kp3tYzHhZzHhYTHiYzXh5OB49LKaTpp/Y/uT1lNX+5PW4vq6nxYyPl4Uwd24nN762iIjUQyaTCS8Px49dbWIYBkV2wxmUCoqDUuEJQerEYPT3PMMZnE4MUieGpr/XYZwQshxtiuwnjNv+Xp9jr5fr88KiE/aOlXFLlCK7Y/px7FAFJzZWl24xwXw1rr/bXl/hRkREGgRT8SEnT4v5LHqa15ySPWBF9uK9VsWhpyRcFdkd4ap0SDph3NneOGn6SaGqZJ0247SBzHV6Wes28PF0b7BVuBEREamFnHvAqBthrDapXfsMRURERM6Swo2IiIjUKwo3IiIiUq8o3IiIiEi9onAjIiIi9YrCjYiIiNQrCjciIiJSr9SKcPPaa68RGxuLt7c3ffr0YdWqVeW2//zzz2nfvj3e3t506dKFBQsW1FClIiIiUtu5Pdx8+umnjB8/nsmTJ7Nu3Tq6detGQkIChw4dKrP9r7/+yvXXX88tt9zC+vXrGTp0KEOHDmXTpk01XLmIiIjURibDMErfvKIG9enTh3PPPZcZM2YAYLfbiYmJ4a677uKhhx4q1X748OHk5OTwzTffOKf17duX7t2788Ybb5z29TIzMwkKCiIjI4PAwMCqeyMiIiJSbc7k99ute24KCgpYu3Yt8fHxzmlms5n4+HhWrlxZ5jIrV650aQ+QkJBwyvb5+flkZma6DCIiIlJ/uTXcHDlyBJvNRkREhMv0iIgIUlJSylwmJSXljNpPnTqVoKAg5xATE1M1xYuIiEit5PY+N9Vt4sSJZGRkOId9+/a5uyQRERGpRm69K3hYWBgWi4XU1FSX6ampqURGRpa5TGRk5Bm1t1qtWK3WqilYREREaj23hhsvLy969erF0qVLGTp0KODoULx06VLuvPPOMpeJi4tj6dKl3HPPPc5pixcvJi4urkKvWdJ/Wn1vRERE6o6S3+0KnQdluNmcOXMMq9VqzJ4929i8ebMxZswYIzg42EhJSTEMwzBuvPFG46GHHnK2X7FiheHh4WFMmzbNSExMNCZPnmx4enoaGzdurNDr7du3zwA0aNCgQYMGDXVw2Ldv32l/69265wYcp3YfPnyYSZMmkZKSQvfu3fnuu++cnYaTkpIwm//uGtSvXz8+/vhjHn30UR5++GHatGnDvHnz6Ny5c4VeLzo6mn379hEQEIDJZKrS95KZmUlMTAz79u3TaeanoW1VcdpWFadtdWa0vSpO26riqmtbGYZBVlYW0dHRp23r9uvc1Ce6hk7FaVtVnLZVxWlbnRltr4rTtqq42rCt6v3ZUiIiItKwKNyIiIhIvaJwU4WsViuTJ0/WqecVoG1VcdpWFadtdWa0vSpO26riasO2Up8bERERqVe050ZERETqFYUbERERqVcUbkRERKReUbgRERGRekXhpoq89tprxMbG4u3tTZ8+fVi1apW7S6pxU6ZMwWQyuQzt27d3zj9+/Djjxo2jUaNG+Pv7M2zYsFI3QU1KSuKyyy7D19eXxo0bc//991NUVFTTb6XK/fTTTwwZMoTo6GhMJhPz5s1zmW8YBpMmTSIqKgofHx/i4+PZvn27S5ujR48yYsQIAgMDCQ4O5pZbbiE7O9ulzZ9//smAAQPw9vYmJiaG559/vrrfWpU73bYaPXp0qe/ZoEGDXNo0lG01depUzj33XAICAmjcuDFDhw5l69atLm2q6u9u+fLl9OzZE6vVSuvWrZk9e3Z1v70qVZFtdcEFF5T6bt1+++0ubRrCtpo5cyZdu3YlMDCQwMBA4uLiWLhwoXN+nfhOnem9oKS0OXPmGF5eXsZ7771n/PXXX8Ztt91mBAcHG6mpqe4urUZNnjzZ6NSpk5GcnOwcDh8+7Jx/++23GzExMcbSpUuNNWvWGH379jX69evnnF9UVGR07tzZiI+PN9avX28sWLDACAsLMyZOnOiOt1OlFixYYDzyyCPG3LlzDcD48ssvXeY/++yzRlBQkDFv3jzjjz/+MP75z38aLVq0MPLy8pxtBg0aZHTr1s347bffjJ9//tlo3bq1cf311zvnZ2RkGBEREcaIESOMTZs2GZ988onh4+NjvPnmmzX1NqvE6bbVqFGjjEGDBrl8z44ePerSpqFsq4SEBGPWrFnGpk2bjA0bNhiXXnqp0axZMyM7O9vZpir+7nbt2mX4+voa48ePNzZv3my8+uqrhsViMb777rsafb9noyLbauDAgcZtt93m8t3KyMhwzm8o22r+/PnGt99+a2zbts3YunWr8fDDDxuenp7Gpk2bDMOoG98phZsq0Lt3b2PcuHHO5zabzYiOjjamTp3qxqpq3uTJk41u3bqVOS89Pd3w9PQ0Pv/8c+e0xMREAzBWrlxpGIbjR81sNjtvmmoYhjFz5kwjMDDQyM/Pr9baa9LJP9h2u92IjIw0XnjhBee09PR0w2q1Gp98
8olhGIaxefNmAzBWr17tbLNw4ULDZDIZBw4cMAzDMF5//XUjJCTEZVs9+OCDRrt27ar5HVWfU4WbK6644pTLNNRtZRiGcejQIQMwfvzxR8Mwqu7v7oEHHjA6derk8lrDhw83EhISqvstVZuTt5VhOMLNf/7zn1Mu01C3lWEYRkhIiPHOO+/Ume+UDkudpYKCAtauXUt8fLxzmtlsJj4+npUrV7qxMvfYvn070dHRtGzZkhEjRpCUlATA2rVrKSwsdNlO7du3p1mzZs7ttHLlSrp06eK8aSpAQkICmZmZ/PXXXzX7RmrQ7t27SUlJcdk2QUFB9OnTx2XbBAcHc8455zjbxMfHYzab+f33351tzj//fLy8vJxtEhIS2Lp1K8eOHauhd1Mzli9fTuPGjWnXrh1jx44lLS3NOa8hb6uMjAwAQkNDgar7u1u5cqXLOkra1OV/407eViU++ugjwsLC6Ny5MxMnTiQ3N9c5ryFuK5vNxpw5c8jJySEuLq7OfKfcflfwuu7IkSPYbDaXDxEgIiKCLVu2uKkq9+jTpw+zZ8+mXbt2JCcn8/jjjzNgwAA2bdpESkoKXl5eBAcHuywTERFBSkoKACkpKWVux5J59VXJeyvrvZ+4bRo3buwy38PDg9DQUJc2LVq0KLWOknkhISHVUn9NGzRoEFdddRUtWrRg586dPPzwwwwePJiVK1disVga7Lay2+3cc8899O/fn86dOwNU2d/dqdpkZmaSl5eHj49PdbylalPWtgK44YYbaN68OdHR0fz55588+OCDbN26lblz5wINa1tt3LiRuLg4jh8/jr+/P19++SUdO3Zkw4YNdeI7pXAjVWbw4MHO8a5du9KnTx+aN2/OZ599Vmf+oKX2u+6665zjXbp0oWvXrrRq1Yrly5dz0UUXubEy9xo3bhybNm3il19+cXcptd6pttWYMWOc4126dCEqKoqLLrqInTt30qpVq5ou063atWvHhg0byMjI4IsvvmDUqFH8+OOP7i6rwnRY6iyFhYVhsVhK9RRPTU0lMjLSTVXVDsHBwbRt25YdO3YQGRlJQUEB6enpLm1O3E6RkZFlbseSefVVyXsr7zsUGRnJoUOHXOYXFRVx9OjRBr/9WrZsSVhYGDt27AAa5ra68847+eabb1i2bBlNmzZ1Tq+qv7tTtQkMDKxz/3E51bYqS58+fQBcvlsNZVt5eXnRunVrevXqxdSpU+nWrRuvvPJKnflOKdycJS8vL3r16sXSpUud0+x2O0uXLiUuLs6NlblfdnY2O3fuJCoqil69euHp6emynbZu3UpSUpJzO8XFxbFx40aXH6bFixcTGBhIx44da7z+mtKiRQsiIyNdtk1mZia///67y7ZJT09n7dq1zjY//PADdrvd+Q9wXFwcP/30E4WFhc42ixcvpl27dnXyMEtF7d+/n7S0NKKiooCGta0Mw+DOO+/kyy+/5Icffih1qK2q/u7i4uJc1lHSpi79G3e6bVWWDRs2ALh8txrCtiqL3W4nPz+/7nynqqRbcgM3Z84cw2q1GrNnzzY2b95sjBkzxggODnbpKd4Q3Hfffcby5cuN3bt3GytWrDDi4+ONsLAw49ChQ4ZhOE4fbNasmfHDDz8Ya9asMeLi4oy4uDjn8iWnD15yySXGhg0bjO+++84IDw+vF6eCZ2VlGevXrzfWr19vAMaLL75orF+/3ti7d69hGI5TwYODg42vvvrK+PPPP40rrriizFPBe/ToYfz+++/GL7/8YrRp08bl9Ob09HQjIiLCuPHGG41NmzYZc+bMMXx9fevc6c3lbausrCxjwoQJxsqVK43du3cbS5YsMXr27Gm0adPGOH78uHMdDWVbjR071ggKCjKWL1/ucvpybm6us01V/N2VnLZ7//33G4mJicZrr71W505vPt222rFjh/HEE08Ya9asMXbv3m189dVXRsuWLY3zzz/fuY6Gsq0eeugh48cffzR2795t/Pnnn8ZDDz1kmEwmY9GiRYZh1I3vlMJNFXn11VeNZs2aGV5eXkbv3r2N3377zd0l1bjhw4cbUVFRhpeXl9GkSRNj+PDhxo4dO5zz8/LyjDvuuMMICQkxfH19jSuvvNJITk52WceePXuMwYMHGz4+PkZYWJhx3333GYWFhTX9VqrcsmXLDKDUMGrUKMMwHKeDP/bYY0ZERIRhtVqNiy66yNi6davLOtLS0ozrr7/e8Pf3NwIDA42bbrrJyMrKcmnzxx9/GOedd55htVqNJk2aGM8++2xNvcUqU962ys3NNS655BIjPDzc8PT0NJo3b27cdtttpf4j0VC2VVnbCTBmzZrlbFNVf3fLli0zunfvbnh5eRktW7Z0eY264HTbKikpyTj//PON0NBQw2q1Gq1btzbuv/9+l+vcGEbD2FY333yz0bx5c8PLy8sIDw83LrroImewMYy68Z0yGYZhVM0+IBERERH3U58bERERqVcUbkRERKReUbgRERGRekXhRkREROoVhRsRERGpVxRuREREpF5RuBEREZF6ReFGRBokk8nEvHnz3F2GiFQDhRsRqXGjR4/GZDKVGgYNGuTu0kSkHvBwdwEi0jANGjSIWbNmuUyzWq1uqkZE6hPtuRERt7BarURGRroMJXfkNplMzJw5k8GDB+Pj40PLli354osvXJbfuHEj//jHP/Dx8aFRo0aMGTOG7OxslzbvvfcenTp1wmq1EhUVxZ133uky/8iRI1x55ZX4+vrSpk0b5s+f75x37NgxRowYQXh4OD4+PrRp06ZUGBOR2knhRkRqpccee4xhw4bxxx9/MGLECK677joSExMByMnJISEhgZCQEFavXs3nn3/OkiVLXMLLzJkzGTduHGPGjGHjxo3Mnz+f1q1bu7zG448/zrXXXsuff/7JpZdeyogRIzh69Kjz9Tdv3szChQtJTExk5syZhIWF1dwGEJHKq7JbcIqIVNCoUaMMi8Vi+Pn5uQxPP/20YRiOOzjffvvtLsv06dPHGDt2rGEYhvHWW28ZISEhRnZ2tnP+t99+a5jNZucdwqOjo41HHnnklDUAxqOPPup8np2dbQDGwoULDcMwjCFDhhg33XRT1bxhEalR6nMjIm5x4YUXMnPmTJdpoaGhzvG4uDiXeXFxcWzYsAGAxMREunXrhp+fn3N+//79sdvtbN26FZPJxMGDB7nooovKraFr167OcT8/PwIDAzl06BAAY8eOZdiwYaxbt45LLrmEoUOH0q9fv0q9VxGpWQo3IuIWfn5+pQ4TVRUfH58KtfP09HR5bjKZsNvtAAwePJi9e/eyYMECFi9ezEUXXcS4ceOYNm1aldcrIlVLfW5EpFb67bffSj3v0KEDAB06dOCPP/4gJyfHOX/FihWYzWbatWtHQEAAsbGxLF269KxqCA8PZ9SoUfzvf//j5Zdf5q233jqr9YlIzdCeGxFxi/z8fFJSUlymeXh4ODvtfv7555xzzjmcd955fPTRR6xatYp3330XgBEjRjB58mRGjRrFlClTOHz4MHfddRc33ngjERERAEyZMoXbb7+dxo0bM3jwYLKysli
xYgV33XVXheqbNGkSvXr1olOnTuTn5/PNN984w5WI1G4KNyLiFt999x1RUVEu09q1a8eWLVsAx5lMc+bM4Y477iAqKopPPvmEjh07AuDr68v333/Pf/7zH84991x8fX0ZNmwYL774onNdo0aN4vjx47z00ktMmDCBsLAwrr766grX5+XlxcSJE9mzZw8+Pj4MGDCAOXPmVME7F5HqZjIMw3B3ESIiJzKZTHz55ZcMHTrU3aWISB2kPjciIiJSryjciIiISL2iPjciUuvoaLmInA3tuREREZF6ReFGRERE6hWFGxEREalXFG5ERESkXlG4ERERkXpF4UZERETqFYUbERERqVcUbkRERKReUbgRERGReuX/AaXHoUUIUV0ZAAAAAElFTkSuQmCC", "text/plain": [ "
" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "model = ANN()\n", "\n", "# Define loss function and optimizer\n", "criterion = nn.CrossEntropyLoss() # NOTE: We select a classification loss\n", "optimizer = optim.Adam(model.parameters(), lr=0.001)\n", "\n", "# Training the model\n", "epochs = 3000\n", "train_losses = []\n", "test_losses = []\n", "for epoch in range(epochs):\n", " optimizer.zero_grad()\n", " outputs = model(X_train)\n", " loss = criterion(outputs, y_train)\n", " loss.backward()\n", " optimizer.step()\n", " train_losses.append(loss.item())\n", "\n", " # Evaluation step on testing set\n", " with torch.no_grad():\n", " test_outputs = model(X_test)\n", " test_loss = criterion(test_outputs, y_test)\n", " test_losses.append(test_loss.item())\n", "\n", " print(f'Epoch {epoch+1}/{epochs}, Training Loss: {loss.item()}, Test Loss: {test_loss.item()}')\n", "\n", "# Plotting the training and testing losses over epochs\n", "plt.plot(range(epochs), train_losses, label='Training Loss')\n", "plt.plot(range(epochs), test_losses, label='Testing Loss')\n", "plt.xlabel('Epochs')\n", "plt.ylabel('Loss')\n", "plt.title('Training and Testing Loss Over Epochs')\n", "plt.legend()\n", "plt.show()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We appear to have achieved a cross-entropy loss of roughly $0.06$. Is that good? This can be hard to determine! Other evaluation metrics are often used to evaluate the performance of ML models for classification. Below we discuss several common metrics." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### 1. Accuracy\n", "\n", "The accuracy is the proportion of correct predictions to the total number of predictions:\n", "$$\n", "\\operatorname{accuracy}=\\frac{\\text{number of correct predictions}}{\\text{total number of predictions}}.\n", "$$\n", "\n", "While relatively simple (high accuracy means a high ratio of correct predictions), accuracy can be misleading if the class distribution is imbalanced. For example, in our earlier meteorite-prediction example, an accuracy of $0.999$ might at first appear to be quite good, but might actually be the accuracy of always prediction \"not a meteorite\"." ] }, { "cell_type": "code", "execution_count": 4, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Accuracy on the test set: 96.00%\n" ] } ], "source": [ "# Switch model to evaluation mode\n", "model.eval()\n", "\n", "# Calculate the number of correct predictions\n", "with torch.no_grad():\n", " outputs = model(X_test)\n", " _, predicted = torch.max(outputs.data, 1)\n", " total = y_test.size(0)\n", " correct = (predicted == y_test).sum().item()\n", "\n", "# Calculate accuracy\n", "accuracy = 100 * correct / total\n", "print(f'Accuracy on the test set: {accuracy:.2f}%')" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "To determine how good this accuracy is, we can compute the probability of each label in the testing set." 
] }, { "cell_type": "code", "execution_count": 5, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Empirical probabilities of labels in the test set:\n", "Label 0: 0.39\n", "Label 1: 0.31\n", "Label 2: 0.31\n" ] } ], "source": [ "import numpy as np\n", "\n", "# Count the occurrences of each label and calculate probabilities\n", "unique, counts = np.unique(y_test.numpy(), return_counts=True)\n", "probabilities = counts / counts.sum()\n", "\n", "# Printing the probabilities\n", "print(\"Empirical probabilities of labels in the test set:\")\n", "for label, prob in zip(unique, probabilities):\n", " print(f\"Label {label}: {prob:.2f}\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "That looks pretty good!" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### 2. Confusion Matrix\n", "\n", "While accuracy tells us how good the model is at making the correct predictions, it doesn't tell us how often it confuses different incorrect labels. For some problems some errors are more severe than others, and so this may be something we want to know. The conflusion matrix is a table that shows the probability that the model makes each possible (mis-)prediction.\n", "\n", "That is, the confusion matrix is a matrix with one row per class (possible label) and one column per class. The $(i,j)^\\text{th}$ entry holds the probability that a row with actual class $i$ will be classified as class $j$. In some cases, rather than computing the probability, the confusion matrix shows the number of points in the test set that fall into each category. Below is the code to compute the comfusion matrix for our model." ] }, { "cell_type": "code", "execution_count": 6, "metadata": {}, "outputs": [ { "data": { "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAoAAAAIjCAYAAACTRapjAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjguMCwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy81sbWrAAAACXBIWXMAAA9hAAAPYQGoP6dpAABXuklEQVR4nO3dd3gU1fv38c8GyCakE1pCCSUQelNRQIrSlSYoVQVEUAELkSJKC4pRVEARQVCKCFYQvqJSpEsXCE1ESmhSDRB6EpLz/MHD/lwSIAsJGzPvF9dcV/bM7Jl71jXc3OfMGZsxxggAAACW4eHuAAAAAHB3kQACAABYDAkgAACAxZAAAgAAWAwJIAAAgMWQAAIAAFgMCSAAAIDFkAACAABYDAkgAACAxZAAArip3bt3q1GjRgoICJDNZtOcOXMytP/9+/fLZrNp6tSpGdrvf1m9evVUr149d4cBIBsjAQT+A/bu3avnnntOJUqUkJeXl/z9/VWrVi19+OGHunTpUqaeu3Pnztq2bZtGjBih6dOn6957783U891NXbp0kc1mk7+/f5qf4+7du2Wz2WSz2fT++++73P+RI0c0bNgwxcTEZEC0AJBxcro7AAA399NPP+mJJ56Q3W7X008/rQoVKigxMVG//fab+vXrpx07dmjixImZcu5Lly5pzZo1euONN9S7d+9MOUdYWJguXbqkXLlyZUr/t5IzZ05dvHhRP/74o9q2beu0b8aMGfLy8tLly5dvq+8jR44oKipKxYoVU5UqVdL9voULF97W+QAgvUgAgSwsNjZW7du3V1hYmJYsWaKQkBDHvl69emnPnj366aefMu38J0+elCQFBgZm2jlsNpu8vLwyrf9bsdvtqlWrlr766qtUCeDMmTP16KOPatasWXcllosXLyp37tzy9PS8K+cDYF0MAQNZ2MiRI3X+/Hl9/vnnTsnfNeHh4Xr55Zcdr69cuaI333xTJUuWlN1uV7FixfT6668rISHB6X3FihVTs2bN9Ntvv6l69ery8vJSiRIl9MUXXziOGTZsmMLCwiRJ/fr1k81mU7FixSRdHTq99vO/DRs2TDabzalt0aJFevDBBxUYGChfX19FRETo9ddfd+y/0RzAJUuWqHbt2vLx8VFgYKBatmypnTt3pnm+PXv2qEuXLgoMDFRAQIC6du2qixcv3viDvU7Hjh31yy+/6MyZM462DRs2aPfu3erYsWOq40+dOqW+ffuqYsWK8vX1lb+/v5o2baotW7Y4jlm2bJnuu+8+SVLXrl0dQ8nXrrNevXqqUKGCNm7cqDp16ih37tyOz+X6OYCdO3eWl5dXqutv3LixgoKCdOTIkXRfKwBIJIBAlvbjjz+qRIkSqlmzZrqOf/bZZzVkyBBVq1ZNo0ePVt26dRUdHa327dunOnbPnj16/PHH1bBhQ33wwQcKCgpSly5dtGPHDklS69atNXr0aElShw4dNH36dI0ZM8al+Hfs2KFmzZopISFBw4cP1wcffKAWLVpo1apVN33fr7/+qsaNG+vEiRMaNmyYIiMjtXr1atWqVUv79+9PdXzbtm117tw5RUdHq23btpo6daqioqLSHWfr1q1ls9k0e/ZsR9vMmTNVpkwZVatWLdXx+/bt05w5c9SsWTONGjVK/fr107Zt21S3bl1HMla2bFkNHz5cktSjRw9Nnz5d06dPV506dRz9xMXFqWnTpqpSpYrGjBmjhx56KM34PvzwQ+XLl0+dO3dWcnKyJOnTTz/VwoULNXbsWIWGhqb7WgFAkmQAZEnx8fFGkmnZsmW6jo+JiTGSzLPPPuvU3rdvXyPJLFmyxNEWFhZmJJkVK1Y42k6cOGHsdrt59dVXHW2xsbFGknnvvfec+uzcubMJCwtLFcPQoUPNv3+tjB492kgyJ0+evGHc184xZcoUR1uVKlVM/vz5TVxcnKNty5YtxsPDwzz99NOpzvfMM8849fnYY4+Z4ODgG57z39fh4+NjjDHm8ccfN/Xr1zfGGJOcnGwKFixooqKi0vwMLl++bJKTk1Ndh91uN8OHD3e0bdiwIdW1XVO3bl0jyUyYMCHNfXXr1nVqW7BggZFk3nrrLbNv3z7j6+trWrVqdctrBIC0UAEEsqizZ89Kkvz8/NJ1/M8//yxJioyMdGp/9dVXJSnVXMFy5cqpdu3ajtf58uVTRESE9u3bd9sxX+/a3MG5c+cqJSUlXe85evSoYmJi1KVLF+XJk8fRXqlSJTVs2NBxnf/2/PPPO72uXbu24uLiHJ9henTs2FHLli3TsWPHtGTJEh07dizN4V/p6rxBD4+rvz6Tk5MVFxfnGN7etGlTus9pt9vVtWvXdB3bqFEjPffccxo+fLhat24tLy8vffrpp+k+FwD8GwkgkEX5+/tLks6dO5eu4w8cOCAPDw+Fh4c7tRcsWFCBgYE6cOCAU3vRokVT9REUFKTTp0/fZsSptWvXTrVq1dKzzz6rAgUKqH379vr2229vmgxeizMiIiLVvrJly+qff/7RhQsXnNqvv5agoCBJculaHnnkEfn5+embb77RjBkzdN9996X6LK9JSUnR6NGjVapUKdntduXNm1f58uXT1q1bFR8fn+5zFipUyKUbPt5//33lyZNHMTEx+uijj5Q/f/50vxcA/o0EEMii/P39FRoaqu3bt7v0vutvwriRHDlypNlujLntc1ybn3aNt7e3VqxYoV9//VVPPfWUtm7dqnbt2qlhw4apjr0Td3It19jtdrVu3VrTpk3TDz/8cMPqnyS9/fbbioyMVJ06dfTll19qwYIFWrRokcqXL5/uSqd09fNxxebNm3XixAlJ0rZt21x6LwD8GwkgkIU1a9ZMe/fu1Zo1a255bFhYmFJSUrR7926n9uPHj+vMmTOOO3ozQlBQkNMds9dcX2WUJA8PD9WvX1+jRo3SH3/8oREjRmjJkiVaunRpmn1fi3PXrl2p9v3555/KmzevfHx87uwCbqBjx47avHmzzp07l+aNM9d8//33euihh/T555+rffv2atSokRo0aJDqM0lvMp4eFy5cUNeuXVWuXDn16NFDI0eO1IYNGzKsfwDWQgIIZGH9+/eXj4+Pnn32WR0/fjzV/r179+rDDz+UdHUIU1KqO3VHjRolSXr00UczLK6SJUsqPj5eW7dudbQdPXpUP/zwg9Nxp06dSvXeawsiX780zTUhISGqUqWKpk2b5pRQbd++XQsXLnRcZ2Z46KGH9Oabb+rjjz9WwYIFb3hcjhw5UlUXv/vuO/39999ObdcS1bSSZVcNGDBABw8e1LRp0zRq1CgVK1ZMnTt3vuHnCAA3w0LQQBZWsmRJzZw5U+3atVPZsmWdngSyevVqfffdd+rSpYskqXLlyurcubMmTpyoM2fOqG7dulq/fr2mTZumVq1a3XCJkdvRvn17DRgwQI899pheeuklXbx4UePHj1fp0qWdboIYPny4VqxYoUcffVRhYWE6ceKEPvnkExUuXFgPPvjgDft/77331LRpU9WoUUPdunXTpUuXNHbsWAUEBGjYsGEZdh3X8/Dw0KBBg255XL
NmzTR8+HB17dpVNWvW1LZt2zRjxgyVKFHC6biSJUsqMDBQEyZMkJ+fn3x8fHT//ferePHiLsW1ZMkSffLJJxo6dKhjWZopU6aoXr16Gjx4sEaOHOlSfwDAMjDAf8Bff/1lunfvbooVK2Y8PT2Nn5+fqVWrlhk7dqy5fPmy47ikpCQTFRVlihcvbnLlymWKFCliBg4c6HSMMVeXgXn00UdTnef65UdutAyMMcYsXLjQVKhQwXh6epqIiAjz5ZdfploGZvHixaZly5YmNDTUeHp6mtDQUNOhQwfz119/pTrH9Uul/Prrr6ZWrVrG29vb+Pv7m+bNm5s//vjD6Zhr57t+mZkpU6YYSSY2NvaGn6kxzsvA3MiNloF59dVXTUhIiPH29ja1atUya9asSXP5lrlz55py5cqZnDlzOl1n3bp1Tfny5dM857/7OXv2rAkLCzPVqlUzSUlJTsf16dPHeHh4mDVr1tz0GgDgejZjXJglDQAAgP885gACAABYDAkgAACAxZAAAgAAWAwJIAAAgMWQAAIAAFgMCSAAAIDFkAACAABYTLZ8Eoh31d7uDgFI5fSGj90dAgBkaV5uzEoyM3e4tDnr/f6nAggAAGAx2bICCAAA4BKbtWpiJIAAAAA2m7sjuKusle4CAACACiAAAIDVhoCtdbUAAACgAggAAMAcQAAAAGRrVAABAACYAwgAAIDsjAogAACAxeYAkgACAAAwBAwAAIDsjAogAACAxYaAqQACAABYDBVAAAAA5gACAAAgO6MCCAAAwBxAAAAAZGdUAAEAACw2B5AEEAAAgCFgAAAAZGdUAAEAACw2BGytqwUAAAAVQAAAACqAAAAAyNaoAAIAAHhwFzAAAACyMSqAAAAAFpsDSAIIAADAQtAAAADIzqgAAgAAWGwI2FpXCwAAACqAAAAAzAEEAABAtkYFEAAAgDmAAAAAyM6oAAIAAFhsDiAJIAAAAEPAAAAAyM6oAAIAAFhsCJgKIAAAgMVQAQQAAGAOIAAAALIzKoAAAADMAQQAAEB2RgUQAADAYnMASQABAAAslgBa62oBAABABRAAAICbQAAAAJCtUQEEAABgDiAAAACyMyqAAAAAzAEEAABAdkYFEAAAwGJzALNUAnj58mUlJiY6tfn7+7spGgAAYBkMAd9dFy9eVO/evZU/f375+PgoKCjIaQMAAEDGcnsC2K9fPy1ZskTjx4+X3W7XZ599pqioKIWGhuqLL75wd3gAAMACbDZbpm1ZkdsTwB9//FGffPKJ2rRpo5w5c6p27doaNGiQ3n77bc2YMcPd4QEAANw10dHRuu++++Tn56f8+fOrVatW2rVrl9Mx9erVS5VkPv/88y6dx+0J4KlTp1SiRAlJV+f7nTp1SpL04IMPasWKFe4MDQAAWERWqQAuX75cvXr10tq1a7Vo0SIlJSWpUaNGunDhgtNx3bt319GjRx3byJEjXTqP228CKVGihGJjY1W0aFGVKVNG3377rapXr64ff/xRgYGB7g4PAADgjiQkJCghIcGpzW63y263pzp2/vz5Tq+nTp2q/Pnza+PGjapTp46jPXfu3CpYsOBtx+T2CmDXrl21ZcsWSdJrr72mcePGycvLS3369FG/fv3cHB0AALAEW+Zt0dHRCggIcNqio6PTFVZ8fLwkKU+ePE7tM2bMUN68eVWhQgUNHDhQFy9edO1yjTHGpXdksgMHDmjjxo0KDw9XpUqVbqsP76q9Mzgq4M6d3vCxu0MAgCzNy43jkj5PTMm0vk992THdFcB/S0lJUYsWLXTmzBn99ttvjvaJEycqLCxMoaGh2rp1qwYMGKDq1atr9uzZ6Y7J7UPA1wsLC1NAQADDvwAA4K7JzLt105PspaVXr17avn27U/InST169HD8XLFiRYWEhKh+/frau3evSpYsma6+3T4E/O677+qbb75xvG7btq2Cg4NVqFAhx9AwAABAZsoqN4Fc07t3b82bN09Lly5V4cKFb3rs/fffL0nas2dPuvt3ewI4YcIEFSlSRJK0aNEiLVq0SL/88ouaNm3KHEAAAGApxhj17t1bP/zwg5YsWaLixYvf8j0xMTGSpJCQkHSfx+1DwMeOHXMkgPPmzVPbtm3VqFEjFStWzJHRAgAAZKassmBzr169NHPmTM2dO1d+fn46duyYJCkgIEDe3t7au3evZs6cqUceeUTBwcHaunWr+vTpozp16rh074TbK4BBQUE6dOiQpKu3Pjdo0EDS1Qw4OTnZnaEBAADcVePHj1d8fLzq1aunkJAQx3Ztupynp6d+/fVXNWrUSGXKlNGrr76qNm3a6Mcff3TpPG6vALZu3VodO3ZUqVKlFBcXp6ZNm0qSNm/erPDwcDdHBwAArCCrVABvtThLkSJFtHz58js+j9srgKNHj1bv3r1Vrlw5LVq0SL6+vpKko0ePqmfPnm6OLvvr+0wj/fZlP5347X0dWBytb0d1V6mw/E7HFC+cV9980F0Hl0Tr+Mr39OW7zyh/Hj83RQwr+3rmDDVt+LDuq1pRndo/oW1bt7o7JFgc30n8V2W5dQAzAusApt/cj3vquwUbtXHHAeXMmUNRvZurfHioqrZ+SxcvJyq3l6c2fDtQ2/76W29O+FmSNLTnowrJF6A6T39wy3+p4P+wDuCdmf/Lzxo0sL8GDY1SxYqVNWP6NC1cOF9z581XcHCwu8ODBfGdzHjuXAcwoOP0TOs7fuZTmdb37XJ7BVCS9u7dqxdffFENGjRQgwYN9NJLL2nfvn3uDssSWvb+RF/+uE479x3Ttr/+Vo+hX6poSB5VLXf1xpwaVUooLDRY3Yd+qR17jmjHniN6dsh0VStXVPWql3Zz9LCS6dOmqPXjbdXqsTYqGR6uQUOj5OXlpTmzZ7k7NFgU30n8l7k9AVywYIHKlSun9evXq1KlSqpUqZLWrVvnGBLG3eXv6yVJOh1/9ZEyds+cMsYoIfGK45jLCVeUkmJUs0r6FpsE7lRSYqJ2/rFDD9So6Wjz8PDQAw/U1NYtm90YGayK72T2k9XWAcxsbr8J5LXXXlOfPn30zjvvpGofMGCAGjZs6KbIrMdms+m9vo9r9ea9+mPvUUnS+m37deFSoka83FJDPv6fbLLprZdbKmfOHCqY19/NEcMqTp85reTk5FTDasHBwYqNZbQAdx/fSfzXub0CuHPnTnXr1i1V+zPPPKM//vjjlu9PSEjQ2bNnnTaTwvIxt2PMwLYqHx6ip1/7v+ch/nP6vDr1/1yP1Kmgf1Z9oOMr31OAr7c2/XFQKcz/AwBkE1QA77J8+fIpJiZGpUqVcmqPiYlR/vz5b/Cu/xMdHa2oqCinthwF7lOukOoZGmd2N3rAE3qkdgU16DZGf58447Rv8do/Vb5FlIIDfXTlSoriz19S7KK3tX/BRvcEC8sJCgxSjhw5FBcX59QeFxenvHnzuikqWBnfyewnqyZqmcXtFcDu3burR48eevfdd7Vy5UqtXLlS77zzjp577jl17979lu8fOHCg4uPjnbacBe65C5FnH6MHPKEWD1dWk+c+0oEjcTc8Lu7MBcWfv
6S695VW/jy+mrd8212MElaWy9NTZcuV17q1axxtKSkpWrdujSpVrurGyGBVfCfxX+f2CuDgwYPl5+enDz74QAMHDpQkhYaGatiwYXrppZdu+X673S673e7UZvPIkSmxZkdjBrZVu6b36ok+E3X+wmUVCL66vl/8+cu6nJAkSXqqxQPaFXtMJ0+f1/2Viuv9fo9r7Iyl2n3ghDtDh8U81bmrBr8+QOXLV1CFipX05fRpunTpklo91trdocGi+E5mL1arALo9AbTZbOrTp4/69Omjc+fOSZL8/Fhk+G55rm0dSdKiz15xau8+ZLq+/HGdJKl0sfwa/mIL5QnIrQNHTmnk5wv00ZdL7naosLgmTR/R6VOn9MnHH+mff04qokxZffLpZwpmuA1uwncS/2VuXwj64Ycf1uzZsxUYGOjUfvbsWbVq1UpLlrieaLAQNLIiFoIGgJtz50LQwZ2/yrS+46Z1yLS+b5fb5wAuW7ZMiYmJqdovX76slStXuiEiAACA7M1tufbWfz0v8Y8//tCxY8ccr5OTkzV//nwVKlTIHaEBAACLYQ7gXVKlShXH+jgPP/xwqv3e3t4aO3asGyIDAADI3tyWAMbGxsoYoxIlSmj9+vXKly+fY5+np6fy58+vHDm4mxcAAGQ+KoB3SVhYmKSr6yYBAAC4k9USQLffBCJJ06dPV61atRQaGqoDBw5IkkaPHq25c+e6OTIAAIDsx+0J4Pjx4xUZGalHHnlEZ86cUXLy1ef4BgUFacyYMe4NDgAAWIMtE7csyO0J4NixYzVp0iS98cYbTnP+7r33Xm3bxqPGAAAAMprbnwQSGxurqlVTPzfRbrfrwoULbogIAABYDXMA77LixYsrJiYmVfv8+fNVtmzZux8QAABANuf2CmBkZKR69eqly5cvyxij9evX66uvvlJ0dLQ+++wzd4cHAAAswGoVQLcngM8++6y8vb01aNAgXbx4UR07dlShQoX04Ycfqn379u4ODwAAINtxewJ46dIlPfbYY+rUqZMuXryo7du3a9WqVSpcuLC7QwMAABZhtQqg2+cAtmzZUl988YUkKTExUS1atNCoUaPUqlUrjR8/3s3RAQAAK7j2eNrM2LIityeAmzZtUu3atSVJ33//vQoUKKADBw7oiy++0EcffeTm6AAAALIftw8BX7x4UX5+fpKkhQsXqnXr1vLw8NADDzzgeCoIAABApsqahbpM4/YKYHh4uObMmaNDhw5pwYIFatSokSTpxIkT8vf3d3N0AAAA2Y/bE8AhQ4aob9++KlasmO6//37VqFFD0tVqYFoLRAMAAGQ0q80BdPsQ8OOPP64HH3xQR48eVeXKlR3t9evX12OPPebGyAAAALIntyeAklSwYEEVLFjQqa169epuigYAAFhNVq3UZRa3DwEDAADg7soSFUAAAAB3sloFkAQQAADAWvkfQ8AAAABWQwUQAABYntWGgKkAAgAAWAwVQAAAYHlUAAEAAJCtUQEEAACWRwUQAAAA2RoVQAAAYHlWqwCSAAIAAFgr/2MIGAAAwGqoAAIAAMuz2hAwFUAAAACLoQIIAAAsjwogAAAAsjUqgAAAwPIsVgCkAggAAGA1VAABAIDlWW0OIAkgAACwPIvlfwwBAwAAWA0VQAAAYHlWGwKmAggAAGAxVAABAIDlWawASAUQAADAaqgAAgAAy/PwsFYJkAogAACAxVABBAAAlme1OYAkgAAAwPJYBgYAAADZGhVAAABgeRYrAFIBBAAAsBoqgAAAwPKYAwgAAIBsjQogAACwPCqAAAAAyNZIAAEAgOXZbJm3uSI6Olr33Xef/Pz8lD9/frVq1Uq7du1yOuby5cvq1auXgoOD5evrqzZt2uj48eMunYcEEAAAWJ7NZsu0zRXLly9Xr169tHbtWi1atEhJSUlq1KiRLly44DimT58++vHHH/Xdd99p+fLlOnLkiFq3bu3SeZgDCAAAkEXMnz/f6fXUqVOVP39+bdy4UXXq1FF8fLw+//xzzZw5Uw8//LAkacqUKSpbtqzWrl2rBx54IF3nIQEEAACWl5n3gCQkJCghIcGpzW63y2633/K98fHxkqQ8efJIkjZu3KikpCQ1aNDAcUyZMmVUtGhRrVmzJt0JIEPAAAAAmSg6OloBAQFOW3R09C3fl5KSoldeeUW1atVShQoVJEnHjh2Tp6enAgMDnY4tUKCAjh07lu6YqAACAADLy8xlYAYOHKjIyEintvRU/3r16qXt27frt99+y/CYSAABAAAyUXqHe/+td+/emjdvnlasWKHChQs72gsWLKjExESdOXPGqQp4/PhxFSxYMN39MwQMAAAsL6ssA2OMUe/evfXDDz9oyZIlKl68uNP+e+65R7ly5dLixYsdbbt27dLBgwdVo0aNdJ+HCiAAAEAW0atXL82cOVNz586Vn5+fY15fQECAvL29FRAQoG7duikyMlJ58uSRv7+/XnzxRdWoUSPdN4BIJIAAAABZ5lFw48ePlyTVq1fPqX3KlCnq0qWLJGn06NHy8PBQmzZtlJCQoMaNG+uTTz5x6TwkgAAAAFmEMeaWx3h5eWncuHEaN27cbZ+HBBAAAFheFikA3jUkgAAAwPKyyhDw3cJdwAAAABZDBRAAAFiexQqA2TMBPL3hY3eHAKRSvOcsd4cAOIn9pI27QwDgJtkyAQQAAHAFcwABAACQrVEBBAAAlmexAiAVQAAAAKuhAggAACzPanMASQABAIDlWSz/YwgYAADAaqgAAgAAy7PaEDAVQAAAAIuhAggAACyPCiAAAACyNSqAAADA8ixWAKQCCAAAYDVUAAEAgOVZbQ4gCSAAALA8i+V/DAEDAABYDRVAAABgeVYbAqYCCAAAYDFUAAEAgOVZrABIBRAAAMBqqAACAADL87BYCZAKIAAAgMVQAQQAAJZnsQIgCSAAAADLwAAAACBbowIIAAAsz8NaBUAqgAAAAFZDBRAAAFgecwABAACQrVEBBAAAlmexAiAVQAAAAKuhAggAACzPJmuVAEkAAQCA5bEMDAAAALI1KoAAAMDyWAYGAAAA2RoVQAAAYHkWKwBSAQQAALAaKoAAAMDyPCxWAnS5Ajht2jT99NNPjtf9+/dXYGCgatasqQMHDmRocAAAAMh4LieAb7/9try9vSVJa9as0bhx4zRy5EjlzZtXffr0yfAAAQAAMpvNlnlbVuTyEPChQ4cUHh4uSZozZ47atGmjHj16qFatWqpXr15GxwcAAJDpWAbmFnx9fRUXFydJWrhwoRo2bChJ8vLy0qVLlzI2OgAAAGQ4lyuADRs21LPPPquqVavqr7/+0iOPPCJJ2rFjh4oVK5bR8QEAAGQ6ixUAXa8Ajhs3TjVq1NDJkyc1a9YsBQcHS5I2btyoDh06uNRXUlKS6tevr927d7saBgAAAG6TyxXAwMBAffzxx6nao6KiXD55rly5tHXrVpffBwAAkJGstgxMuhJAV5K0SpUquRTAk08+qc8//1zvvPOOS+8DAADA7UlXAlilShXZbDYZY9Lcf22fzWZTcnKySwFcuXJFkydP1q+//qp7
7rlHPj4+TvtHjRrlUn8AAACuslb9L50JYGxsbKYFsH37dlWrVk2S9Ndffznts9ot2QAAAHdDuhLAsLCwTAtg6dKlmdY3AABAelit6OTyXcCSNH36dNWqVUuhoaGOx7+NGTNGc+fOvaNgDh8+rMOHD99RHwAAAK7ysGXelhW5nACOHz9ekZGReuSRR3TmzBnHnL/AwECNGTPG5QBSUlI0fPhwBQQEKCwsTGFhYQoMDNSbb76plJQUl/sDAADAzbmcAI4dO1aTJk3SG2+8oRw5cjja7733Xm3bts3lAN544w19/PHHeuedd7R582Zt3rxZb7/9tsaOHavBgwe73B8AAICrbDZbpm1ZkcvrAMbGxqpq1aqp2u12uy5cuOByANOmTdNnn32mFi1aONoqVaqkQoUKqWfPnhoxYoTLfQIAAODGXK4AFi9eXDExMana58+fr7Jly7ocwKlTp1SmTJlU7WXKlNGpU6dc7g8AAMBVNlvmbVmRyxXAyMhI9erVS5cvX5YxRuvXr9dXX32l6OhoffbZZy4HULlyZX388cf66KOPnNo//vhjVa5c2eX+AAAAcHMuJ4DPPvusvL29NWjQIF28eFEdO3ZUaGioPvzwQ7Vv397lAEaOHKlHH31Uv/76q2rUqCFJWrNmjQ4dOqSff/7Z5f4AAABclVXn6mWW21oGplOnTtq9e7fOnz+vY8eO6fDhw+rWrdttBVC3bl399ddfeuyxx3TmzBmdOXNGrVu31q5du1S7du3b6hMAAAA35nIF8JoTJ05o165dkq5mzfny5bvtIEJDQ7nZAwAAuE1WXa8vs7icAJ47d049e/bUV1995VinL0eOHGrXrp3GjRungICAW/axdevWdJ+vUqVKroYIAADgEqsNAd/WHMDNmzfrp59+cpqz9/LLL+u5557T119/fcs+qlSpIpvNJmPMTY+z2WyOhaYBAACQMVxOAOfNm6cFCxbowQcfdLQ1btxYkyZNUpMmTdLVR2xsrKunBQAAyDTWqv/dRgIYHByc5jBvQECAgoKC0tVHWFiYq6cFAABABnH5LuBBgwYpMjJSx44dc7QdO3ZM/fr1u+1Ht+3du1cvvviiGjRooAYNGuill17S3r17b6svAAAAV3nYbJm2ZUXpSgCrVq2qatWqqVq1apowYYLWrl2rokWLKjw8XOHh4SpatKhWr16tTz/91OUAFixYoHLlymn9+vWqVKmSKlWqpHXr1ql8+fJatGiRy/0BAAD8l61YsULNmzdXaGiobDab5syZ47S/S5cuqZ43nN5peNekawi4VatWLnXqitdee019+vTRO++8k6p9wIABatiwYaadGwAAQMpaj2y7cOGCKleurGeeeUatW7dO85gmTZpoypQpjtd2u92lc6QrARw6dKhLnbpi586d+vbbb1O1P/PMMxozZkymnRcAACAratq0qZo2bXrTY+x2uwoWLHjb57itJ4FkpHz58ikmJiZVe0xMjPLnz3/3AwIAAJZz/ZBqRm4JCQk6e/as05aQkHBH8S5btkz58+dXRESEXnjhBcXFxbn0fpfvAk5OTtbo0aP17bff6uDBg0pMTHTaf+rUKZf66969u3r06KF9+/apZs2akqRVq1bp3XffVWRkpKvhAQAAZCnR0dGKiopyahs6dKiGDRt2W/01adJErVu3VvHixbV37169/vrratq0qdasWaMcOXKkqw+XE8CoqCh99tlnevXVVzVo0CC98cYb2r9/v+bMmaMhQ4a4fBGDBw+Wn5+fPvjgAw0cOFDS1UfDDRs2TC+99JLL/QEAALgqM+cADhw4MFVRy9U5e//Wvn17x88VK1ZUpUqVVLJkSS1btkz169dPVx8uJ4AzZszQpEmT9Oijj2rYsGHq0KGDSpYsqUqVKmnt2rUuJ202m019+vRRnz59dO7cOUmSn5+fq2Ehg309c4amTflc//xzUqUjyui11werIo/lw13yYpMIPVItVOEF/XQ5MVm/7zult2Zt097j5x3HjHyyqmqXza8CAd66mHBFG/bGacTs7dpz7JwbI4fV8Lsy+8jM5VrsdvsdJXy3UqJECeXNm1d79uxJdwLo8hzAY8eOqWLFipIkX19fxcfHS5KaNWumn376ydXuFBsbq927d0u6mvhdS/52796t/fv3u9wf7tz8X37W+yOj9VzPXvr6ux8UEVFGLzzXzeX5BcDtqlE6r6Ys3adHo5eq3ZjflDOHTV+/8qC8Pf9vaGPrgTPqM3Wj6gxdqA4f/iabTfr6lQct90B3uA+/K5FVHD58WHFxcQoJCUn3e1xOAAsXLqyjR49KkkqWLKmFCxdKkjZs2HBb2W2XLl20evXqVO3r1q1Tly5dXO4Pd276tClq/XhbtXqsjUqGh2vQ0Ch5eXlpzuxZ7g4NFtHxo1X6ds0B/XX0nP44HK9XpvyuwsE+qhz2f08b+nJlrNbu/keH4y5q28EzenfODhXKk1tF8vq4MXJYCb8rsxebLfM2V50/f14xMTGOm2RjY2MVExOjgwcP6vz58+rXr5/Wrl2r/fv3a/HixWrZsqXCw8PVuHHjdJ/D5QTwscce0+LFiyVJL774ogYPHqxSpUrp6aef1jPPPONqd9q8ebNq1aqVqv2BBx5I8+5gZK6kxETt/GOHHqhR09Hm4eGhBx6oqa1bNrsxMliZn3cuSdLpC4lp7vf2zKH2tYrpwMkLOnLq4t0MDRbF70pkpt9//11Vq1ZV1apVJUmRkZGqWrWqhgwZohw5cmjr1q1q0aKFSpcurW7duumee+7RypUrXSrEuTwH8N8LNrdr105hYWFavXq1SpUqpebNm7vanWw2m2Pu37/Fx8crOTnZ5f5wZ06fOa3k5GQFBwc7tQcHBys2dp+booKV2WzS8HaVtX7PP9p15KzTvs51S2hwm4ry8cqpPcfOqd2YlUpKNm6KFFbC78rsx5aFVoKuV6+ejLnx77IFCxbc8TnueB3ABx54QJGRkbr//vv19ttvu/z+OnXqKDo62inZS05OVnR0tB588MFbvj8z1tYBkHVEd6iqMqH+en7i+lT7Zq8/qIZvLdZj7y3X3uPnNLHH/bLndPvypgCQ5WXYb8qjR49q8ODBLr/v3Xff1ZIlSxQREaGuXbuqa9euioiI0IoVK/Tee+/d8v3R0dEKCAhw2t57N/p2LgGSggKDlCNHjlSTmOPi4pQ3b143RQWrGtGhihpUKqg2H6zQ0TOXUu0/d+mKYk+c19rd/6j7hLUKL+inplVD3RAprIbfldmPRyZuWZHb4ypXrpy2bt2qtm3b6sSJEzp37pyefvpp/fnnn6pQocIt3z9w4EDFx8c7bf0GDLwLkWdPuTw9VbZcea1bu8bRlpKSonXr1qhS5apujAxWM6JDFTWtEqonRq3Uobhbz+u7uuK+5JkzfYugAneC35X4r3N5DmBmCA0Nva3hYynttXUuX8mIqKzrqc5dNfj1ASpfvoIqVKykL6dP06VLl9TqsbQfSA1ktOiOVfRY9SLq+skanb+cpHz+V/8fP3cpSZeTUlQ0r49a3ltYy/84rrjzCQoJ9FbvphG
6lJisxduPuTl6WAW/K7OXrDQH8G5wSwK4detWVahQQR4eHtq6detNj63Egpp3XZOmj+j0qVP65OOP9M8/JxVRpqw++fQzBTOsgbukS72SkqTZfes6tb885Xd9u+aAEpKSdX+pvOreIFwBuT118uxlrdv9j1q8u0xx55gDjLuD35XZi9XWELWZm91m8i+3ei7vyZMnNXPmzHTduevh4aFjx44pf/788vDwkM1mS/NuF5vNdlt3AlMBRFZUvCdrgyFrif2kjbtDAJx4uXFc8pW5f2Za32Nalsm0vm9Xuj/qzZtvva5RnTp10tVXbGys8uXL5/gZAADAnaxWAUx3Arh06dIMO2lYWFiaPwMAACDzuf0u4GnTpjk9Q7h///4KDAxUzZo1deDAATdGBgAArOLqSgKZs2VFbk8A3377bXl7e0uS1qxZo48//lgjR45U3rx51adPHzdHBwAAkP24fRmYQ4cOKTw8XJI0Z84cPf744+rRo4dq1aqlevXquTc4AABgCVabA+j2CqCvr69jJfWFCxeqYcOGkiQvLy9dupR65X8AAADcGbdXABs2bKhnn31WVatW1V9//aVHHnlEkrRjxw4VK1bMvcEBAABLyKJT9TLNbVUAV65cqSeffFI1atTQ33//LUmaPn26fvvtN5f7GjdunGrWrKmTJ09q1qxZCg4OliRt3LhRHTp0uJ3wAAAAXOJhs2XalhW5XAGcNWuWnnrqKXXq1EmbN29WQsLVVffj4+P19ttv6+eff053X1euXNFHH32kAQMGqHDhwk77oqKiXA0NAAAA6eByBfCtt97ShAkTNGnSJOXKlcvRXqtWLW3atMmlvnLmzKmRI0fqyhUe3QEAANzHIxO3rMjluHbt2pXmEz8CAgJ05swZlwOoX7++li9f7vL7AAAAcHtcHgIuWLCg9uzZk+oGjd9++00lSpRwOYCmTZvqtdde07Zt23TPPffIx8fHaX+LFi1c7hMAAMAVWXSqXqZxOQHs3r27Xn75ZU2ePFk2m01HjhzRmjVr1LdvXw0ePNjlAHr27ClJGjVqVKp9NptNycnJLvcJAACAG3M5AXzttdeUkpKi+vXr6+LFi6pTp47sdrv69u2rF1980eUAUlJSXH4PAABARsqqd+tmFpcTQJvNpjfeeEP9+vXTnj17dP78eZUrV06+vr53HMzly5fl5eV1x/0AAADgxm775hRPT0+VK1dO1atXv6PkLzk5WW+++aYKFSokX19f7du3T5I0ePBgff7557fdLwAAQHrZbJm3ZUUuVwAfeugh2W5yNUuWLHGpvxEjRmjatGkaOXKkunfv7mivUKGCxowZo27durkaIgAAgEus9ixglxPAKlWqOL1OSkpSTEyMtm/frs6dO7scwBdffKGJEyeqfv36ev755x3tlStX1p9//ulyfwAAALg5lxPA0aNHp9k+bNgwnT9/3uUA/v77b4WHh6dqT0lJUVJSksv9AQAAuMpqN4Fk2ALVTz75pCZPnuzy+8qVK6eVK1emav/+++9VtWrVjAgNAAAA/+JyBfBG1qxZc1t38A4ZMkSdO3fW33//rZSUFM2ePVu7du3SF198oXnz5mVUeAAAADdksQKg6wlg69atnV4bY3T06FH9/vvvt7UQdMuWLfXjjz9q+PDh8vHx0ZAhQ1StWjX9+OOPatiwocv9AQAA4OZcTgADAgKcXnt4eCgiIkLDhw9Xo0aNXA7g2Wef1ZNPPqlFixa5/F4AAICMwF3AN5GcnKyuXbuqYsWKCgoKypAATp48qSZNmihfvnzq0KGDOnXqpMqVK2dI3wAAAEjNpZtAcuTIoUaNGunMmTMZFsDcuXN19OhRDR48WOvXr1e1atVUvnx5vf3229q/f3+GnQcAAOBGbJn4Jyty+S7gChUqOJ7WkVGCgoLUo0cPLVu2TAcOHFCXLl00ffr0NJeHAQAAyGgetszbsiKXE8C33npLffv21bx583T06FGdPXvWabsTSUlJ+v3337Vu3Trt379fBQoUuKP+AAAAkFq6E8Dhw4frwoULeuSRR7Rlyxa1aNFChQsXVlBQkIKCghQYGHjb8wKXLl2q7t27q0CBAurSpYv8/f01b948HT58+Lb6AwAAcIXVKoDpvgkkKipKzz//vJYuXZqhARQqVEinTp1SkyZNNHHiRDVv3lx2uz1DzwEAAID/k+4E0BgjSapbt26GBjBs2DA98cQTCgwMzNB+AQAA0stmsZWgXVoGJjM+nO7du2d4nwAAALgxlxLA0qVL3zIJPHXq1B0FBAAAcLdl1bl6mcWlBDAqKirVk0AAAADw3+JSAti+fXvlz58/s2IBAABwC4tNAUx/Ami1yZEAAMA6PCyW56R7HcBrdwEDAADgvy3dFcCUlJTMjAMAAMBtrHYTiMuPggMAAMB/m0s3gQAAAGRHFpsCSAUQAADAaqgAAgAAy/OQtUqAVAABAAAshgogAACwPKvNASQBBAAAlscyMAAAAMjWqAACAADL41FwAAAAyNaoAAIAAMuzWAGQCiAAAIDVUAEEAACWxxxAAAAAZGtUAAEAgOVZrABIAggAAGC1IVGrXS8AAIDlUQEEAACWZ7PYGDAVQAAAAIuhAggAACzPWvU/KoAAAACWQwUQAABYHgtBAwAAIFujAggAACzPWvU/EkAAAADLPQmEIWAAAACLIQEEAACWZ7PZMm1z1YoVK9S8eXOFhobKZrNpzpw5TvuNMRoyZIhCQkLk7e2tBg0aaPfu3S6dgwQQAAAgC7lw4YIqV66scePGpbl/5MiR+uijjzRhwgStW7dOPj4+aty4sS5fvpzuczAHEAAAWF5mVsQSEhKUkJDg1Ga322W329M8vmnTpmratGma+4wxGjNmjAYNGqSWLVtKkr744gsVKFBAc+bMUfv27dMVExVAAACATBQdHa2AgACnLTo6+rb6io2N1bFjx9SgQQNHW0BAgO6//36tWbMm3f1QAQQAAJZ3O3P10mvgwIGKjIx0artR9e9Wjh07JkkqUKCAU3uBAgUc+9KDBBAAACAT3Wy4110YAgYAAJZny8QtIxUsWFCSdPz4caf248ePO/alBwkgAADAf0Tx4sVVsGBBLV682NF29uxZrVu3TjVq1Eh3PwwBAwAAy8vMOYCuOn/+vPbs2eN4HRsbq5iYGOXJk0dFixbVK6+8orfeekulSpVS8eLFNXjwYIWGhqpVq1bpPofNGGMyIXa3unzF3REAqf1zLuHWBwF3UYuxq9wdAuBk05CH3Xbu2VuOZlrfrSuHuHT8smXL9NBDD6Vq79y5s6ZOnSpjjIYOHaqJEyfqzJkzevDBB/XJJ5+odOnS6T4HCSBwl5AAIqshAURWQwJ49zAEDAAALC8rDQHfDdwEAgAAYDFUAAEAgOVZq/5HBRAAAMByqAACAADLs9gUQCqAAAAAVkMFEAAAWJ6HxWYBkgACAADLYwgYAAAA2RoVQAAAYHk2iw0BUwEEAACwGCqAAADA8pgDCAAAgGyNCiAAALA8qy0DQwUQAADAYqgAAgAAy7PaHEASQAAAYHlWSwAZAgYAALAYKo
AAAMDyWAgaAAAA2RoVQAAAYHke1ioAUgEEAACwGiqAAADA8pgDCAAAgGyNCiAAALA8q60DSAIIAAAsjyFgAAAAZGtUAAEAgOWxDAwAAACyNSqAAADA8pgDCAAAgGyNCiAAALA8qy0DQwUQAADAYqgAAgAAy7NYAZAEEAAAwMNiY8AMAQMAAFgMFUAAAGB51qr/UQEEAACwHCqAAAAAFisBUgEEAACwGCqAAADA8ngUHAAAALI1KoAAAMDyLLYMIAkgAACAxfI/hoABAACshgogAACAxUqAVAABAAAshgogAACwPJaBAQAAQLbm9gpgcnKyRo8erW+//VYHDx5UYmKi0/5Tp065KTIAAGAVVlsGxu0VwKioKI0aNUrt2rVTfHy8IiMj1bp1a3l4eGjYsGHuDg8AACDbcXsCOGPGDE2aNEmvvvqqcubMqQ4dOuizzz7TkCFDtHbtWneHBwAALMCWiVtW5PYE8NixY6pYsaIkydfXV/Hx8ZKkZs2a6aeffnJnaAAAwCoslgG6PQEsXLiwjh49KkkqWbKkFi5cKEnasGGD7Ha7O0MDAADIltyeAD722GNavHixJOnFF1/U4MGDVapUKT399NN65pln3BwdAACwAlsm/smK3H4X8DvvvOP4uV27dgoLC9Pq1atVqlQpNW/e3I2RAQAAZE9uTwCv98ADD+iBBx5wdxgAAMBCWAbmLouOjtbkyZNTtU+ePFnvvvuuGyICAADI3tyeAH766acqU6ZMqvby5ctrwoQJbogIAABYjcVuAnZ/Anjs2DGFhISkas+XL5/j7mAAAABkHLcngEWKFNGqVatSta9atUqhoaFuiAgAAFiOxUqAbr8JpHv37nrllVeUlJSkhx9+WJK0ePFi9e/fX6+++qqbowMAAFaQVZdrySxuTwD79eunuLg49ezZU4mJiZIkLy8vDRgwQAMHDnRzdAAAANmP2xNAm82md999V4MHD9bOnTvl7e2tUqVK8RQQAABw11htGRi3J4DX+Pr66r777nN3GAAAANmeWxLA1q1ba+rUqfL391fr1q1veuzs2bPvUlQAAMCqLFYAdE8CGBAQINv/r7UGBAS4IwQAAADLcksCOGXKlDR/BgAAcAuLlQDdvg4gAAAArho2bJhsNpvTltYT0+6U228COX78uPr27avFixfrxIkTMsY47U9OTnZTZNb29cwZmjblc/3zz0mVjiij114frIqVKrk7LFjU/2Z9o//N/lbHjx6RJIWVKKmnnnlO99es7ebIYBVda4Xp4TL5VCxvbiVcSdGWQ/H6aPFeHYi76DjGM4eHIhuFq1H5AvLMadOavacU/fMunbqQ5MbIkV5ZaR3A8uXL69dff3W8zpkz49M1tyeAXbp00cGDBzV48GCFhIQ45gbCfeb/8rPeHxmtQUOjVLFiZc2YPk0vPNdNc+fNV3BwsLvDgwXlzV9A3Xu9okKFi8rIaOFP/9OQ/i/r0y++VbES4e4ODxZwT1igvv39sHYcOaccHjb1friEPulURW3Gr9XlpBRJ0quNw/Vgqbwa8P12nU+4ogFNS+v9thX1zJRNbo4e/zU5c+ZUwYIFM/ccmdp7Ovz2229auXKlqlSp4u5Q8P9NnzZFrR9vq1aPtZEkDRoapRUrlmnO7Fnq1r2Hm6ODFdWsXc/pdbcXXtKPP3yrP7ZvJQHEXdF75han10Pn7tSSvrVVLsRfmw6eka89h1pVDdXrs3dow/7TkqRhc3dqdq8HVLGQv7b9fdYdYcMFmVl/SkhIUEJCglOb3W6/4ZrHu3fvVmhoqLy8vFSjRg1FR0eraNGiGRqT2+cAFilSJNWwL9wnKTFRO//YoQdq1HS0eXh46IEHamrrls1ujAy4Kjk5WUsW/aLLly6pXMXK7g4HFuVnv1o/ib90dXi3bIi/cuXw0Lp9px3H7I+7qKNnLqtSYVa7+C/IzEcBR0dHKyAgwGmLjo5OM477779fU6dO1fz58zV+/HjFxsaqdu3aOnfuXIZer9srgGPGjNFrr72mTz/9VMWKFXN3OJZ3+sxpJScnpxrqDQ4OVmzsPjdFBUj79vylF7s/pcTERHl751bUu2NUrHhJd4cFC7JJ6tu4lDYfPKO9Jy9IkoJ9PZV4JUXnE644HRt3IVHBvp5uiBJZycCBAxUZGenUdqPqX9OmTR0/V6pUSffff7/CwsL07bffqlu3bhkWk9sTwHbt2unixYsqWbKkcufOrVy5cjntP3Xq1E3fn1ZZ1eS4cVkVwH9TkbDimvjFd7pw4bxWLFmkd4cP0qjxk0kCcde99khplczvw9y+7CYTh4BvNtx7K4GBgSpdurT27NmToTG5PQEcM2bMHb0/OjpaUVFRTm1vDB6qQUOG3VG/VhUUGKQcOXIoLi7OqT0uLk558+Z1U1SAlCtXLhUqcnUOTOky5bTrj+2a/c0MRb42xM2RwUoGNCmt2qXy6tlpm3Ti3P8VH+LOJ8ozp4d87TmdqoDBPp6KO5/ojlCRTZw/f1579+7VU089laH9uj0B7Ny58x29P62yqslB9e925fL0VNly5bVu7Ro9XL+BJCklJUXr1q1R+w5Pujk64P+kmBQlJfIXK+6eAU1K66Ey+dT9i006cuay076dR88qKTlF1YsHacmfJyVJYcG5FRLopa2H490RLlyUVZaB6du3r5o3b66wsDAdOXJEQ4cOVY4cOdShQ4cMPY9bEsCzZ8/K39/f8fPNXDvuRtIqq16+coODkS5Pde6qwa8PUPnyFVShYiV9OX2aLl26pFaP3fy5zUBm+eyTD1W9Ri3lLxCiixcvaMnCX7Rl0+96Z8wEd4cGi3itaWk1rVhAfb7ZposJyQr2uTqv73zCFSVcSdH5hGTN2XxErzYqpbOXk3QhIVn9m5TWlkPx3AEMlxw+fFgdOnRQXFyc8uXLpwcffFBr165Vvnz5MvQ8bkkAg4KCdPToUeXPn1+BgYFprv1njJHNZmMhaDdo0vQRnT51Sp98/JH++eekIsqU1SeffqZghoDhJqdPn9I7UYN0Ku6kfHx9VaJkab0zZoLuvb+Gu0ODRbS9r7Ak6bPO1Zzah879Qz9uOSZJ+mDBHhkjvfdERXnm8NCavXGK/vmvux4rbk9WWYb466+/vivnsRk3rMGyfPly1apVSzlz5tTy5ctvemzdunVd7p8KILKif84l3Pog4C5qMXaVu0MAnGwa8rDbzr3r2MVbH3SbIgrmzrS+b5dbKoD/TupuJ8EDAADISFmkAHjXuP0mkK1bt6bZbrPZ5OXlpaJFi7KkCwAAyFwWywDdngBWqVLlps//zZUrl9q1a6dPP/1UXl5edzEyAACA7Mntj4L74YcfVKpUKU2cOFExMTGKiYnRxIkTFRERoZkzZ+rzzz/XkiVLNGjQIHeHCgAAsilbJv7JitxeARwxYoQ+/PBDNW7c2NFWsWJFFS5cWIMHD9b69evl4+OjV199Ve+//74bIwUAAMge3J4Abtu2TWFhYanaw8LCtG3bNklXh4mPHj16t0MDAAAWkVWWgblb3D4EXKZMGb3zzjtK/NeK/klJSXrnnXdUp
kwZSdLff/+tAgUKuCtEAACAbMXtFcBx48apRYsWKly4sCpVqiTpalUwOTlZ8+bNkyTt27dPPXv2dGeYAAAgG7NYAdD9CWDNmjUVGxurGTNm6K+/rq6Y/sQTT6hjx47y8/OTpAx/ADIAAICVuTUBTEpKUpkyZTRv3jw9//zz7gwFAABYmcVKgG5NAHPlyqXLly+7MwQAAIAsu1xLZnH7TSC9evXSu+++qytXeIAvAADA3eD2OYAbNmzQ4sWLtXDhQlWsWFE+Pj5O+2fPnu2myAAAgFVYbRkYtyeAgYGBatOmjbvDAAAAsAy3J4BTpkxxdwgAAMDiLFYAdP8cQAAAANxdbqkAVqtWTYsXL1ZQUJCqVq0q200G3jdt2nQXIwMAAJZksRKgWxLAli1bym63S5JatWrljhAAAAAsyy0J4NChQx0/Hzp0SJ06ddJDDz3kjlAAAABYB/BuO3nypJo2baoiRYqof//+2rJli7tDAgAAFmOzZd6WFbk9AZw7d66OHj2qwYMHa/369apWrZrKly+vt99+W/v373d3eAAAANmO2xNASQoKClKPHj20bNkyHThwQF26dNH06dMVHh7u7tAAAIAF2DJxy4qyRAJ4TVJSkn7//XetW7dO+/fvV4ECBdwdEgAAQLaTJRLApUuXqnv37ipQoIC6dOkif39/zZs3T4cPH3Z3aAAAwAKsNgfQ7U8CKVSokE6dOqUmTZpo4sSJat68uWOJGAAAAGQ8tyeAw4YN0xNPPKHAwEB3hwIAACwri5bqMonbE8Du3bu7OwQAAABLcXsCCAAA4G5Zda5eZiEBBAAAlmex/C9r3AUMAACAu4cKIAAAsDyrDQFTAQQAALAYKoAAAMDybBabBUgFEAAAwGKoAAIAAFirAEgFEAAAwGqoAAIAAMuzWAGQBBAAAIBlYAAAAJCtUQEEAACWxzIwAAAAyNaoAAIAAFirAEgFEAAAwGqoAAIAAMuzWAGQCiAAAIDVUAEEAACWZ7V1AEkAAQCA5bEMDAAAALI1KoAAAMDyrDYETAUQAADAYkgAAQAALIYEEAAAwGKYAwgAACyPOYAAAADI1qgAAgAAy7PaOoAkgAAAwPIYAgYAAEC2RgUQAABYnsUKgFQAAQAArIYKIAAAgMVKgFQAAQAALIYKIAAAsDyrLQNDBRAAAMBiqAACAADLYx1AAAAAZGtUAAEAgOVZrABIAggAAGC1DJAhYAAAAIshAQQAAJZny8Q/t2PcuHEqVqyYvLy8dP/992v9+vUZer0kgAAAAFnIN998o8jISA0dOlSbNm1S5cqV1bhxY504cSLDzkECCAAALM9my7zNVaNGjVL37t3VtWtXlStXThMmTFDu3Lk1efLkDLteEkAAAIBMlJCQoLNnzzptCQkJaR6bmJiojRs3qkGDBo42Dw8PNWjQQGvWrMmwmLLlXcBe2fKq7r6EhARFR0dr4MCBstvt7g7nP69wEJ/hneI7mbE2DXnY3SFkC3wvs4fMzB2GvRWtqKgop7ahQ4dq2LBhqY79559/lJycrAIFCji1FyhQQH/++WeGxWQzxpgM6w3ZytmzZxUQEKD4+Hj5+/u7OxyA7ySyJL6XuJWEhIRUFT+73Z7mPxiOHDmiQoUKafXq1apRo4ajvX///lq+fLnWrVuXITFRKwMAAMhEN0r20pI3b17lyJFDx48fd2o/fvy4ChYsmGExMQcQAAAgi/D09NQ999yjxYsXO9pSUlK0ePFip4rgnaICCAAAkIVERkaqc+fOuvfee1W9enWNGTNGFy5cUNeuXTPsHCSAuCG73a6hQ4cyqRlZBt9JZEV8L5HR2rVrp5MnT2rIkCE6duyYqlSpovnz56e6MeROcBMIAACAxTAHEAAAwGJIAAEAACyGBBAAAMBiSAABZGn79++XzWZTTExMluwP/y3Dhg1TlSpV7rifZcuWyWaz6cyZM+l+T5cuXdSqVas7PjeQEbgJBNq/f7+KFy+uzZs3Z8gvRiAjJScn6+TJk8qbN69y5rzzhQv4vlvb+fPnlZCQoODg4DvqJzExUadOnVKBAgVks9nS9Z74+HgZYxQYGHhH5wYyAsvAAHCrpKQk5cqV64b7c+TIkaGr32eExMREeXp6ujsM3AZfX1/5+vrecH96/9t6enq6/L0MCAhw6XggMzEEnI18//33qlixory9vRUcHKwGDRrowoULkqTPPvtMZcuWlZeXl8qUKaNPPvnE8b7ixYtLkqpWrSqbzaZ69epJurry+PDhw1W4cGHZ7XbHOkTXJCYmqnfv3goJCZGXl5fCwsIUHR3t2D9q1ChVrFhRPj4+KlKkiHr27Knz58/fhU8CmWXixIkKDQ1VSkqKU3vLli31zDPPSJLmzp2ratWqycvLSyVKlFBUVJSuXLniONZms2n8+PFq0aKFfHx8NGLECJ0+fVqdOnVSvnz55O3trVKlSmnKlCmS0h6y3bFjh5o1ayZ/f3/5+fmpdu3a2rt3r6Rbf2/Tsnz5clWvXl12u10hISF67bXXnGKuV6+eevfurVdeeUV58+ZV48aN7+hzROa51Xf0+iHga8OyI0aMUGhoqCIiIiRJq1evVpUqVeTl5aV7771Xc+bMcfoeXj8EPHXqVAUGBmrBggUqW7asfH191aRJEx09ejTVua5JSUnRyJEjFR4eLrvdrqJFi2rEiBGO/QMGDFDp0qWVO3dulShRQoMHD1ZSUlLGfmCwLoNs4ciRIyZnzpxm1KhRJjY21mzdutWMGzfOnDt3znz55ZcmJCTEzJo1y+zbt8/MmjXL5MmTx0ydOtUYY8z69euNJPPrr7+ao0ePmri4OGOMMaNGjTL+/v7mq6++Mn/++afp37+/yZUrl/nrr7+MMca89957pkiRImbFihVm//79ZuXKlWbmzJmOmEaPHm2WLFliYmNjzeLFi01ERIR54YUX7v6Hgwxz6tQp4+npaX799VdHW1xcnKNtxYoVxt/f30ydOtXs3bvXLFy40BQrVswMGzbMcbwkkz9/fjN58mSzd+9ec+DAAdOrVy9TpUoVs2HDBhMbG2sWLVpk/ve//xljjImNjTWSzObNm40xxhw+fNjkyZPHtG7d2mzYsMHs2rXLTJ482fz555/GmFt/b9PqL3fu3KZnz55m586d5ocffjB58+Y1Q4cOdcRct25d4+vra/r162f+/PNPx7mQ9dzqOzp06FBTuXJlx77OnTsbX19f89RTT5nt27eb7du3m/j4eJMnTx7z5JNPmh07dpiff/7ZlC5d2ul7s3TpUiPJnD592hhjzJQpU0yuXLlMgwYNzIYNG8zGjRtN2bJlTceOHZ3O1bJlS8fr/v37m6CgIDN16lSzZ88es3LlSjNp0iTH/jfffNOsWrXKxMbGmv/973+mQIEC5t13382Uzw3WQwKYTWzcuNFIMvv370+1r2TJkk6JmTFXf7HUqFHDGJP6L8RrQkNDzYgRI5za7rvvPtOzZ09jjDEvvviiefjhh01KSkq6Yvzuu+9McHBwei8JWVTLli3NM88843j96aefmtDQUJOcnGzq169v3n77bafjp0+f
bkJCQhyvJZlXXnnF6ZjmzZubrl27pnm+67+fAwcONMWLFzeJiYlpHn+r7+31/b3++usmIiLC6Xs8btw44+vra5KTk40xVxPAqlWr3ugjQRZzs+9oWglggQIFTEJCgqNt/PjxJjg42Fy6dMnRNmnSpFsmgJLMnj17HO8ZN26cKVCggNO5riWAZ8+eNXa73Snhu5X33nvP3HPPPek+HrgZhoCzicqVK6t+/fqqWLGinnjiCU2aNEmnT5/WhQsXtHfvXnXr1s0x98XX11dvvfWWY8gsLWfPntWRI0dUq1Ytp/ZatWpp586dkq4OZ8TExCgiIkIvvfSSFi5c6HTsr7/+qvr166tQoULy8/PTU089pbi4OF28eDHjPwDcNZ06ddKsWbOUkJAgSZoxY4bat28vDw8PbdmyRcOHD3f6rnXv3l1Hjx51+u9+7733OvX5wgsv6Ouvv1aVKlXUv39/rV69+obnj4mJUe3atdOcN5ie7+31du7cqRo1ajhN5K9Vq5bOnz+vw4cPO9ruueeem3wqyEpu9h1NS8WKFZ3m/e3atUuVKlWSl5eXo6169eq3PG/u3LlVsmRJx+uQkBCdOHEizWN37typhIQE1a9f/4b9ffPNN6pVq5YKFiwoX19fDRo0SAcPHrxlHEB6kABmEzly5NCiRYv0yy+/qFy5cho7dqwiIiK0fft2SdKkSZMUExPj2LZv3661a9fe0TmrVaum2NhYvfnmm7p06ZLatm2rxx9/XNLVeVvNmjVTpUqVNGvWLG3cuFHjxo2TdHXuIP67mjdvLmOMfvrpJx06dEgrV65Up06dJF29wzIqKsrpu7Zt2zbt3r3b6S9THx8fpz6bNm2qAwcOqE+fPjpy5Ijq16+vvn37pnl+b2/vzLu4m7g+ZmRdN/uOpiWj/tte/48Sm80mc4OFNm71PV6zZo06deqkRx55RPPmzdPmzZv1xhtv8PsTGYYEMBux2WyqVauWoqKitHnzZnl6emrVqlUKDQ3Vvn37FB4e7rRdu/nj2r98k5OTHX35+/srNDRUq1atcjrHqlWrVK5cOafj2rVrp0mTJumbb77RrFmzdOrUKW3cuFEpKSn64IMP9MADD6h06dI6cuTIXfgUkNm8vLzUunVrzZgxQ1999ZUiIiJUrVo1SVf/UbBr165U37Xw8PAbVl+uyZcvnzp37qwvv/xSY8aM0cSJE9M8rlKlSlq5cmWak+HT+739t7Jly2rNmjVOf1GvWrVKfn5+Kly48E1jRtZ0s+9oekRERGjbtm2OCqIkbdiwIUNjLFWqlLy9vbV48eI0969evVphYWF64403dO+996pUqVI6cOBAhsYAa2MZmGxi3bp1Wrx4sRo1aqT8+fNr3bp1OnnypMqWLauoqCi99NJLCggIUJMmTZSQkKDff/9dp0+fVmRkpPLnzy9vb2/Nnz9fhQsXlpeXlwICAtSvXz8NHTpUJUuWVJUqVTRlyhTFxMRoxowZkq7e5RsSEqKqVavKw8ND3333nQoWLKjAwECFh4crKSlJY8eOVfPmzbVq1SpNmDDBzZ8SMkqnTp3UrFkz7dixQ08++aSjfciQIWrWrJmKFi2qxx9/3DEsvH37dr311ls37G/IkCG65557VL58eSUkJGjevHkqW7Zsmsf27t1bY8eOVfv27TVw4EAFBARo7dq1ql69uiIiIm75vb1ez549NWbMGL344ovq3bu3du3apaFDhyoyMvKWSSuyrht9R9OjY8eOeuONN9SjRw+99tprOnjwoN5//31JSveaf7fi5eWlAQMGqH///vL09FStWrV08uRJ7dixQ926dVOpUqV08OBBff3117rvvvv0008/6YcffsiQcwOSuAs4u/jjjz9M48aNTb58+YzdbjelS5c2Y8eOdeyfMWOGqVKlivH09DRBQUGmTp06Zvbs2Y79kyZNMkWKFDEeHh6mbt26xhhjkpOTzbBhw0yhQoVMrly5TOXKlc0vv/zieM/EiRNNlSpVjI+Pj/H39zf169c3mzZtcuwfNWqUCQkJMd7e3qZx48bmiy++cJo0jf+u5ORkExISYiSZvXv3Ou2bP3++qVmzpvH29jb+/v6mevXqZuLEiY79kswPP/zg9J4333zTlC1b1nh7e5s8efKYli1bmn379hlj0r5JacuWLaZRo0Ymd+7cxs/Pz9SuXdsRx62+t2n1t2zZMnPfffcZT09PU7BgQTNgwACTlJTk2F+3bl3z8ssv3+GnhrvpRt/RtG4C+fedudesWrXKVKpUyXh6epp77rnHzJw500hy3AGe1k0gAQEBTn388MMP5t9/zV5/ruTkZPPWW2+ZsLAwkytXLlO0aFGnm6j69etngoODja+vr2nXrp0ZPXp0qnMAt4sngQAAcAszZsxQ165dFR8f77Z5qEBGYggYAIDrfPHFFypRooQKFSqkLVu2aMCAAWrbti3JH7INEkAAAK5z7NgxDRkyRMeOHVNISIieeOIJp6d0AP91DAEDAABYDLe4AQAAWAwJIAAAgMWQAAIAAFgMCSAAAIDFkAACAABYDAkggNvWpUsXtWrVyvG6Xr16euWVV+56HMuWLZPNZtOZM2cy7RzXX+vtuBtxAkB6kAAC2UyXLl1ks9lks9nk6emp8PBwDR8+XFeuXMn0c8+ePVtvvvlmuo6928lQsWLFNGbMmLtyLgDI6lgIGsiGmjRpoilTpighIUE///yzevXqpVy5cmngwIGpjk1MTJSnp2eGnDdPnjwZ0g8AIHNRAQSyIbvdroIFCyosLEwvvPCCGjRooP/973+S/m8oc8SIEQoNDVVERIQk6dChQ2rbtq0CAwOVJ08etWzZUvv373f0mZycrMjISAUGBio4OFj9+/fX9evIXz8EnJCQoAEDBqhIkSKy2+0KDw/X559/rv379+uhhx6SJAUFBclms6lLly6SpJSUFEVHR6t48eLy9vZW5cqV9f333zud5+eff1bp0qXl7e2thx56yCnO25GcnKxu3bo5zhkREaEPP/wwzWOjoqKUL18++fv76/nnn1diYqJjX3pi/7cDBw6oefPmCgoKko+Pj8qXL6+ff/75jq4FANKDCiBgAd7e3oqLi3O8Xrx4sfz9/bVo0SJJUlJSkho3bqwaNWpo5cqVypkzp9566y01adJEW7dulaenpz744ANNnTpVkydPVtmyZfXBBx/ohx9+0MMPP3zD8z799NNas2aNPvroI1WuXFmxsbH6559/VKRIEc2aNUtt2rTRrl275O/v73jGanR0tL788ktNmDBBpUqV0ooVK/Tkk08qX758qlu3rg4dOqTWrVurV69e6tGjh37//Xe9+uqrd/T5pKSkqHDhwvruu+8UHBys1atXq0ePHgoJCVHbtm2dPjcvLy8tW7ZM+/fvV9euXRUcHOx4RNitYr9er169lJiYqBUrVsjHx0d//PGHfH197+haACBdDIBspXPnzqZly5bGGGNSUlLMokWLjN1uN3379nXsL1CggElISHC8Z/r06SYiIsKkpKQ42hISEoy3t7dZsGCBMcaYkJAQM3LkSMf+pKQkU7hwYce
5jDGmbt265uWXXzbGGLNr1y4jySxatCjNOJcuXWokmdOnTzvaLl++bHLnzm1Wr17tdGy3bt1Mhw4djDHGDBw40JQrV85p/4ABA1L1db2wsDAzevToG+6/Xq9evUybNm0crzt37mzy5MljLly44GgbP3688fX1NcnJyemK/fprrlixohk2bFi6YwKAjEIFEMiG5s2bJ19fXyUlJSklJUUdO3bUsGHDHPsrVqzoNO9vy5Yt2rNnj/z8/Jz6uXz5svbu3av4+HgdPXpU999/v2Nfzpw5de+996YaBr4mJiZGOXLkSLPydSN79uzRxYsX1bBhQ6f2xMREVa1aVZK0c+dOpzgkqUaNGuk+x42MGzdOkydP1sGDB3Xp0iUlJiaqSpUqTsdUrlxZuXPndjrv+fPndejQIZ0/f/6WsV/vpZde0gsvvKCFCxeqQYMGatOmjSpVqnTH1wIAt0ICCGRDDz30kMaPHy9PT0+FhoYqZ07n/9V9fHycXp8/f1733HOPZsyYkaqvfPny3VYM14Z0XXH+/HlJ0k8//aRChQo57bPb7bcVR3p8/fXX6tu3rz744APVqFFDfn5+eu+997Ru3bp093E7sT/77LNq3LixfvrpJy1cuFDR0dH64IMP9OKLL97+xQBAOpAAAtmQj4+PwsPD0318tWrV9M033yh//vzy9/dP85iQkBCtW7dOderUkSRduXJFGzduVLVq1dI8vmLFikpJSdHy5cvVoEGDVPuvVSCTk5MdbeXKlZPdbtfBgwdvWDksW7as44aWa9auXXvri7yJVatWqWbNmurZs6ejbe/evamO27Jliy5duuRIbteuXStfX18VKVJEefLkuWXsaSlSpIief/55Pf/88xo4cKAmTZpEAggg03EXMAB16tRJefPmVcuWLbVy5UrFxsZq2bJleumll3T48GFJ0ssvv6x33nlHc+bM0Z9//qmePXvedA2/YsWKqXPnznrmmWc0Z84cR5/ffvutJCksLEw2m03z5s3TyZMndf78efn5+alv377q06ePpk2bpr1792rTpk0aO3aspk2bJkl6/vnntXv3bvXr10+7du3SzJkzNXXq1HRd599//62YmBin7fTp0ypVqpR+//13LViwQH/99ZcGDx6sDRs2pHp/YmKiunXrpj/++EM///yzhg4dqt69e8vDwyNdsV/vlVde0YIFCxQbG6tNmzZp6dKlKlu2bLquBQDuiLsnIQLIWP++CcSV/UePHjVPP/20yZs3r7Hb7aZEiRKme/fuJj4+3hhz9aaPl19+2fj7+5vAwEATGRlpnn766RveBGKMMZcuXTJ9+vQxISEhxtPT04SHh5vJkyc79g8fPtwULFjQ2Gw207lzZ2PM1RtXxowZYyIiIkyuXLlMvnz5TOPGjc3y5csd7/vxxx9NeHi4sdvtpnbt2mby5MnpuglEUqpt+vTp5vLly6ZLly4mICDABAYGmhdeeMG89tprpnLlyqk+tyFDhpjg4GDj6+trunfvbi5fvuw45laxX38TSO/evU3JkiWN3W43+fLlM0899ZT5559/bngNAJBRbMbcYAY3AAAAsiWGgAEAACyGBBAAAMBiSAABAAAshgQQAADAYkgAAQAALIYEEAAAwGJIAAEAACyGBBAAAMBiSAABAAAshgQQAADAYkgAAQAALOb/ASvvYt3jdBbsAAAAAElFTkSuQmCC", "text/plain": [ "
" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "import matplotlib.pyplot as plt\n", "import seaborn as sns\n", "from sklearn.metrics import confusion_matrix\n", "\n", "model.eval()\n", "\n", "# Get predictions\n", "with torch.no_grad():\n", " outputs = model(X_test)\n", " _, predicted = torch.max(outputs, 1)\n", "\n", "# Compute the confusion matrix\n", "cm = confusion_matrix(y_test.numpy(), predicted.numpy())\n", "\n", "# Plotting the confusion matrix\n", "plt.figure(figsize=(8, 6))\n", "sns.heatmap(cm, annot=True, fmt='g', cmap='Blues', xticklabels=iris.target_names, yticklabels=iris.target_names)\n", "plt.xlabel('Predicted Labels')\n", "plt.ylabel('True Labels')\n", "plt.title('Confusion Matrix')\n", "plt.show()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Although the test set (and training set!) is relatively small for this problem, and so conclusions can't be made with high-confidence, it seems that our model primarily makes mistakes where it misclassifies \"virginica\" iris plants as \"versicolor\" iris plants.\n", "\n", "**Note**: If this confusion matrix were to be normalized, each cell would be divided by the sum of the values in the same **row**. Hence, the entries $(i,j)^\\text{th}$ row indicate how likely a point of class $i$ will be given each label by the model." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### 3. Precision, Recall, and F1-Score\n", "\n", "For binary classification tasks, where one label is viewed as the \"positive\" label (e.g., $1$ when labels are $0$ and $1$), statistics like precision, recall, and the F1 score are often used to evaluate the model.\n", "\n", "These metrics are often expressed in terms of the following statistics:\n", "1. **True Positive (TP)**: The number of points (rows) with label $1$ and where the model predicted $1$.\n", "2. **False Positive (FP)**: The number of points (rows) with label $0$, but where the model predicted $1$.\n", "3. **False Negative (FN)**: The number of points (rows) with label $1$, but where the model predicted $0$.\n", "4. **True Negative (TN)**: The number of points (rows) with label $0$ and where the model predicted $0$.\n", "\n", "**Precision** measures the ratio of the correctly predicted positive labels to the total predicted positives. That is:\n", "$$\n", "\\text{Precision}=\\frac{\\text{TP}}{\\text{TP}+\\text{FP}}.\n", "$$\n", "\n", "**Recall** measures the ratio of the correctly predicted positive labels to the total number of positives. That is:\n", "$$\n", "\\text{Recall}=\\frac{\\text{TP}}{\\text{TP}+\\text{FN}}.\n", "$$\n", "\n", "For stochastic classifiers, these definitions can be generalized to account for the probability of each possible error. For example, precision is given by:\n", "$$\n", "\\text{Precision}=\\Pr(Y_i=1 | \\hat Y_i = 1),\n", "$$\n", "and \n", "$$\n", "\\text{Recall}=\\Pr(\\hat Y_i=1 | Y_i = 1).\n", "$$\n", "\n", "The **F1 score** combines the precision and recall, and is given by the equation:\n", "$$\n", "\\text{F}_1\\text{ Score}=2 \\frac{\\text{precision}\\cdot \\text{recall}}{\\text{precision} + \\text{recall}}.\n", "$$\n", "This is the *harmonic mean* of the precision and recall, which places more weight on low values relative to the more common arithmetic mean.\n", "\n", "The F1 score ranges from 0 to 1, where 1 denotes perfect precision and recall, and 0 means that either precision or recall is zero.\n", "\n", "We won't compute these scores for our model, since it is not a binary classification model." 
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### 4. ROC and AUC" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The **Receiver Operating Characteristic (ROC)** curve and the **Area Under the ROC Curve** (AUC) are two of the most common metrics for binary classification problems. **Unlike the other metrics, these methods assume that the parametric model's output is compared to a threshold. If the output is above the thresold, a label of 1 is predicted, and otherwise the label 0 is predicted.** By tuning this threshold parameter, you can adjust the tradeoff between different types of errors.\n", "\n", "For example, if you are observing too many false positives, then the model is outputting the label 1 too often, and so the threshold can be increased. This will typically reduce the number of false positives, but may increase the number of false negatives.\n", "\n", "ROC and AUC measure this trade-off. Unlike the other statistics that have been discussed, they aren't measuring the performance of a single parametric model for making predictions, but a range of parametric models (those that result from using different thresholds with your trained model).\n", "\n", "Here is an example three ROC curves from Wikipedia. The horizontal axis depics the false positive rate: $\\text{FPR}=\\frac{\\text{FP}}{\\text{FP}+\\text{TN}}$, while the vertical axis depicts the true positive rate: $\\text{TPR}=\\frac{\\text{TP}}{\\text{TP}+\\text{FN}}$. Each point on the curve indicates that there is some threshold for selecting the positive label as the prediction of the model that results in the specified FPR and TPR:\n", "" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The dashed line denotes the ROC curve that results from a random classifier (a model that outputs a sample from the continuous uniform distribution on $[0,1]$ without considering the input at all). Curves above this line are doing better than a random classifier, while curves below are doing worse than a random classifier (this shouldn't happen for any decent classifier!)." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Notice that higher values are better, as they indicate that for any given FPR, the model can achieve a better TPR. \n", "\n", "The AUC is a statistic that summarizes a ROC curve by computing the area undernear the curve. For a perfect model, the ROC curve would have a TPR of one for all possible values of FPR, and hence the AUC (area under the ROC curve) would be one. A pessimal model (one that gets all predictions wrong) would have an AUC of zero. The random classifier achieves an AUC of 0.5." ] } ], "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.11.7" } }, "nbformat": 4, "nbformat_minor": 2 }